PolarIced
Apr 6, 10:09 AM
Has Intel R&D come up with a new, low-power, backlit keyboard? ;)
(Figured I'd throw that out straight off, as it's bound to come up somewhere along the line)
Macnoviz
Jul 21, 04:20 PM
I'm not ripping DVDs. I'm ripping DVD IMAGES made with Toast from EyeTV2 Digital SD and HD recordings to archive off-air broadcast recordings for my personal use only. Nothing to do with seeding anything to anyone. Need more cores to encode and rip simultaneously instead of sequentially. Much faster to do a bunch of one or two shows simultaneously than larger sets sequentially. More cores will also allow for faster compacting of the edited shows - i.e., removal of ads - in the first place.
Oh, so that's why you want Handbrake fourfold; I was going to ask whether you had 4 optical drives.
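The simultaneous-versus-sequential point above can be sketched in a few lines: start one encode per show at once and wait for all of them, instead of running them back to back. The file paths, preset name, and HandBrakeCLI flags here are illustrative assumptions, not anything specified in the thread:

```python
import shlex
import subprocess
from typing import List

def encode_commands(sources: List[str], preset: str = "Fast 1080p30") -> List[List[str]]:
    """Build one HandBrakeCLI command per source file (paths/preset are hypothetical)."""
    cmds = []
    for src in sources:
        dst = src.rsplit(".", 1)[0] + ".m4v"
        cmds.append(["HandBrakeCLI", "-i", src, "-o", dst, "--preset", preset])
    return cmds

def encode_all(sources: List[str], dry_run: bool = False) -> List[str]:
    """Launch every encode at once and wait, instead of running them sequentially."""
    cmds = encode_commands(sources)
    if dry_run:
        # Just show what would run, one command line per show.
        return [shlex.join(c) for c in cmds]
    procs = [subprocess.Popen(c) for c in cmds]  # one process per show
    for p in procs:
        p.wait()
    return []
```

With enough cores, N shows finish in roughly the time of the slowest one rather than the sum of all of them.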
hobi316
Jun 9, 06:43 AM
RadioShack store manager here and i have some very interesting information if you guys don't already know this. Please quote this as much as possible to get the word out.
How can I check which stores will be carrying the phone on launch day? And also, if I go into a particular store next Tuesday and pre-order, you're saying I will be able to pick that phone up on the 24th?
Blue Velvet
Apr 27, 03:19 PM
I tried, I discovered layers.
Fact: There are "layers" if you can even call them that.
Another Fact: They mean nothing.
They're not layers in any common use of the word in design. However, for want of a better word, they're elements. Those looking for them need to view the file in outline mode in Illustrator (Apple Y)
Of course not, they will find something else to argue about.
True... and I'll leave that for others. It was a mistake of mine to look at MR today and be sucked into the stupidity. Now I really must take leave of all of you...
slooksterPSV
Aug 7, 02:07 PM
I can't wait till spring for Leopard. That's too long, I want Leopard now :D :D :D come on Steve, give us Leopard!
Gemütlichkeit
Apr 6, 10:27 AM
This is what I've been waiting for. Apple is about to get a chunk of my bank account lol. Upgrading from an early 2008 MBP
bassfingers
Apr 27, 01:49 PM
Who would think I'd support Bush? He's not conservative enough for me, and his administration spent too much.
How much did government intervene in business affairs during the Roaring 20's? The government has already failed to do what it should do: It should promote the common good. I find it hard to believe that the U.S. Government had this country's best interests at heart when I hear Mrs. Pelosi say that to find out what's in Obamacare, you need to pass it.
I know a lot about alcoholism and codependence because my mother is a nurse who specialized in treating alcoholics and other drug addicts and in counseling them. You don't help an alcoholic by protecting him from the consequences of his actions. The protection can help him make even bigger mistakes. I've seen that happen in many families I know of that include alcoholics. I also know about entitled welfare recipients who abuse social programs by demanding too much from social programs, by getting it, and by defrauding them. I saw the entitlement firsthand when a relative of mine was a landlord who rented houses to welfare recipients. Welfare recipients ruined a house, my relative kept the security deposit, and then the family got the Department of Social Services to put them into a house for twice the rent my relative charged. But the family still had the nerve to complain that my relative had overcharged it.
Good points. All of them.
not sarcasm^
patrick0brien
Jul 20, 06:39 PM
Actually, that was my point, but now that you mention it, reversed hyperthreading would solve some problems.
In the long run (the really long run; I'm talking quantum computers here), however, you are right, and innovation in computing will mostly come from software and how you tell the computer what to do. The ne plus ultra would be thinking of a result and getting it (or saying it to your computer), like a Photoshop user going: well, I would like the sun to be more dominant in that picture, the power lines removed, and those people made to look younger. Boom. It happens.
-Macnoviz
Woah. Well, there's more than raw computing involved there; there is context for the computer to understand. What is the "sun"? What does "dominant" really mean? What are power lines? What does "remove" really mean? And let's not go into what kind of DB would be needed to describe all of the differences a person's face exhibits over a lifetime!
I'm sure we'll get there, and when such 'life' DBs are built I hope there is a standard set! Who says we don't need these really big drives?
JakeM.
Aug 7, 06:59 PM
Did anyone else think it was odd that many of the features seemed so poorly presented? We didn't actually get to see anything new in Spotlight, and no new features of the actual Dashboard were even discussed.
It just doesn't seem that Leopard is as far along as Tiger was when previewed, even though Leopard is supposed to ship in the spring just as Tiger did.
nickXedge
Apr 7, 11:12 PM
Not saying this story is true or false, but Best Buy employs a non-commissioned sales staff. There are no quotas to speak of. This is a public company, and sales quotas would be accessible to stockholders.
Serves them right. Bastards. It's amazing how easily they sucker people into buying an $80 HDMI cable when they can get a higher quality cable from Monoprice for less than five bucks.
I do not intend to be rude, but there is a difference in HDMI cables, no matter what the Internet tells you. Conductors, shielding materials/layers and the way the connectors are put together are a few differentiators. An AudioQuest Coffee cable, for example, which is several hundred dollars ($600, I believe, for a 1.5m), is made of pure silver starting with the tips and going the length of the cable. This is not the same as a no-name $5 HDMI cable from Amazon.
11thIndian
Apr 5, 10:14 PM
Sorry, but that's not the case. While some contend it's jaw-dropping, that's only because they're stacking it up against what FCS is currently. Compared to what Avid and Adobe are doing, Apple now has a mountain to climb. Apple has been too interested in their entertainment business to worry about their "pro" line (hardware/software). I know quite a few studios who have already shifted BACK to Avid, and some are taking on the Adobe Suite completely as their software of choice. While some may find the new FCS exciting, and it does have some bells and whistles, it's typical Apple doing an incremental bump to keep up with what others are doing. Sad really.
So if you were one of the 100 people up to now who's seen it and can accurately make this evaluation, let's see your invite....
BWhaler
Jul 14, 03:35 PM
Since apple is part of the Blu Ray consortium wouldn't you think they will use blu ray only?
Not a chance in the near future. Blu Ray and Sony are in utter shambles right now.
Alx9876
Apr 6, 01:27 PM
What a joke of a tablet. Nothing but a piece of crap.
epitaphic
Aug 18, 11:46 PM
So you think they put an extra processor in across the line just to be able to say they had a quad? Even the AnandTech article you used as a source showed here (http://www.anandtech.com/mac/showdoc.aspx?i=2816&p=18) that PS took advantage of quad cores in Rosetta
Yes, under some specific results the quad was a bit faster than the dual, though with the combo of Rosetta + Photoshop it's unclear what is causing the difference. However, if you compare the vast majority of the benchmarks, there's negligible difference.
Concerning Photoshop specifically, as can be experienced on a quad G5, the performance increase is 15-20%. A future jump to 8 cores would theoretically bring an increase of around 8%. Photoshop (CS2) simply cannot scale adequately beyond 2 cores; maybe that'll change in Spring 2007. Fingers crossed it does.
Your points about latency and FSB are not separate negatives as you have made them. They are redundant theoretical concerns with implications of unclear practical significance.
I beg to differ. If an app or game is memory intensive, faster memory access does matter. Barefeats (http://barefeats.com/quad09.html) has some benchmarks on dual channel vs quad channel on the Mac Pro. I'd personally like to see that benchmark with an added Conroe system. If dual to quad channel gave a 16-25% improvement, imagine what a 75% increase in actual bandwidth will do. Besides, I was merely addressing your statements that Woodcrest is faster because of its higher speed FSB and higher memory bus bandwidth.
I am not worried. Everything anyone has come up with on this issue is taken from that same AnandTech article. Until I see more real-world testing, I will not be convinced. Also, I expect that more pro apps such as PS will be able to utilize quad cores in the near future, if they aren't already doing so. Finally, even if Conroe is faster, Woodcrest is fast enough for me ;).
AnandTech, at the moment, is the only place with a quad Xeon vs dual Xeon benchmark. And yes, dual Woodcrest is fast enough, but is it cost effective compared to a single Woodcrest/Conroe? It seems that for the most part, Mac Pro users are paying for an extra chip but only really utilizing it when running several CPU intensive apps at the same time.
I think you misread that. They were comparing Core 2 Extreme (not Woodcrest) and Conroe to see whether the increased FSB of the former would make much difference.
You're absolutely right about that; it's only measuring the improvement from the increased FSB. If you take into account FB-DIMM's appalling efficiency, there should be no increase at all (if not a decrease) for memory intensive apps.
One question I'd like to put out there: if Apple has had a quad-core Mac shipping for the past 8 months, why would it wait till Intel quads to optimize the code for FCP? Surely they must have known for some time beforehand that they would release a quad-core G5, so either optimizing FCP for quads is a real bastard or they've been sitting on it for no reason.
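For what it's worth, the scaling figures quoted above (15-20% going dual to quad, around 8% for a further jump to 8 cores) are roughly what Amdahl's law predicts if a bit under half of Photoshop's work parallelizes. A quick sketch; the parallel fraction here is fitted to the quoted numbers, not measured:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: overall speedup on n cores when a fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

# Fit p so that going from 2 to 4 cores gives ~17.5% (midpoint of the quoted 15-20%).
p = 0.0
while amdahl_speedup(p, 4) / amdahl_speedup(p, 2) < 1.175:
    p += 0.001

# Predicted gain from a further jump to 8 cores with that same p.
gain_8 = amdahl_speedup(p, 8) / amdahl_speedup(p, 4) - 1.0
# p lands near 0.46, and gain_8 near 10% -- the same ballpark as the ~8% quoted.
```

In other words, the quoted numbers are internally consistent with an app whose parallel fraction is stuck below 50%, which is why extra cores beyond two buy so little.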
adamfilip
Jul 21, 10:13 AM
Now you just need to decide what color you want your new computer... (again)
I want Apple to take the current PowerMac G5 Case
make it 25% shorter, add a second optical drive
and two more Internal hard drives
add some external SATA ports and 4 more USB 2.0 ports
1 more front USB 2.0 port
make the mic port powered
and then make the case black anodized aluminum. and have the apple logo on the sides backlit just like the notebooks
ergle2
Sep 15, 12:50 PM
More pedantic details for those who are interested... :)
NT actually started as OS/2 3.0. Its lead architect was OS guru Dave Cutler, who is famous for architecting VMS for DEC, and naturally its design influenced NT. And the N-10 (where "NT" comes from: "N-Ten") Intel RISC processor was never intended to be a mainstream product; Dave Cutler insisted that the development team NOT use an x86 processor, to make sure they would have no excuse to fall back on legacy code or thought. In fact, the N-10 build that was the default work environment for the team was never intended to leave the Microsoft campus. NT over its life has run on x86, DEC Alpha, MIPS, PowerPC, Itanium, and x64.
IBM and Microsoft worked together on OS/2 1.0 from 1985-1989. Much maligned, it did suck because it was targeted for the 286, not the 386, but it did break new ground -- preemptive multitasking and an advanced GUI (Presentation Manager). By 1989 they wanted to move on to something that would take advantage of the 386's 32-bit architecture, flat memory model, and virtual machine support. Simultaneously they started OS/2 2.0 (extend the current 16-bit code to a 16/32-bit hybrid) and OS/2 3.0 (a ground-up, platform-independent version). When Windows 3.0 took off in 1990, Microsoft had second thoughts and eventually broke with IBM. OS/2 3.0 became Windows NT -- in the first days of the split, NT still had the OS/2 Presentation Manager APIs for its GUI. They ripped those out and created the Win32 APIs. That's also why to this day NT/2K/XP support OS/2 command line applications, and there was also a little-known GUI pack that would support OS/2 1.x GUI applications.
All very true, but beyond that -- if you've ever looked closely at VMS and at NT, you'll notice it's a lot more than just "influenced". The core design was pretty much identical -- the way I/O worked, the interrupt handling, the scheduler, and so on -- they're all practically carbon copies. Some of the names changed, but how things work under the hood hadn't. Since then it's evolved, of course, but you'd expect that.
Quite amusing, really... how a heavyweight enterprise-class OS of the 80's became the desktop of the 00's :)
Those that were around in the dim and distant past will recall that VMS and Unix were two of the main competitors in many marketplaces in the 80's and early 90's... and today we have OS X, Linux, FreeBSD, Solaris, etc. vs XP, W2K3 Server and (soon) Vista -- kind of ironic, dontcha think? :)
Of course, there's a lot still running VMS to this very day. I don't think HP wants them to tho' -- they just sent all the support to India, apparently, to a team with relatively little experience...
robbyx
Apr 25, 04:05 PM
This suit has merit. If I turn off location services there should be no record of where I go.
Why would you assume that turning off location services would prevent tracking? The phone is still connected to the cell network. I'd assume Airplane Mode would turn off tracking, but not location services.
With that and other simple info I can find out where you work, where you bank, where you live, what time you usually get home. All it takes is one website or email attachment to compromise your device. This info is not encrypted.
I do think that if any device does this, they should be sued.
First, someone would have to obtain your phone. No one seems to mention this. Big bad Apple is tracking us all!!! Apple isn't tracking anyone. The phone is logging location information for some reason, perhaps legit, perhaps a bug, perhaps test code that got left behind, who knows. The point is, your location isn't compromised unless someone steals your phone.
And if they steal your phone, they'll have your address book, your web bookmarks, your email, your notes, etc.
Suing over this is idiotic and really shows how absurd this whole "privacy" debate has become. Scott McNealy said it best years ago: "Privacy is dead. Get over it."
tortoise
Aug 22, 05:19 PM
The next Xeon is Clovertown, which is just Woodcrest scaled to 4 cores with a few changes in clock and FSB etc. Tigerton comes next, also 4 cores but MP capable (3+ chips possible) and with a possibility of increased FSB speed, bigger L2 cache and so on.
This will likely suck, because the interconnect Intel is using is just too damn slow. Putting four cores in the same package will just make the situation worse, because a lot of applications are significantly limited by memory performance.
The Woodcrest processors have been put through their paces pretty well on the supercomputing lists, and their Achilles' heel is the memory subsystem. Current-generation AMD Opterons still clearly outscale Woodcrest in real-world memory bandwidth with only two cores. Unless Intel pulls a rabbit out of their hat with their memory architecture issues when the quad core is released, AMD's quad core is going to embarrass them because of the memory bottleneck. And AMD is already starting to work on upgrading their already markedly superior memory architecture.
rwilliams
Mar 22, 01:13 PM
This is just a preview of the future: Android-based tablets will clean the iPad's clock. Apple made the so-called iPad 2 as a 1.5. Low-res camera, not enough RAM, and low-res screen. It's going to be a verrrry long 2012 for Apple. Sure it's selling like hot cakes now, but when buyers see tablets that they don't have to stand in line for, that have better equipment and are cheaper... Apple's house of cards will come crashing down around them.
The only strength that Apple has is the app ecosystem, which is why they are going after Amazon for spitting on the sidewalk. They know the world of hurt coming their way.
Well, you knew it was only a matter of time before this cat showed up.
sittnick
Apr 19, 01:50 PM
BREAKING NEWS --- 1979 --
http://www.thetelemediagroup.com/images/monitors/pg5/3_ab121w.jpg
http://www.thetelemediagroup.com/images/monitors/pg5/4_gebw.jpg
RCA Launches Suit Against General Electric for infringement of 9" b&w television interface and "look and feel."
Spokesmen for RCA maintain that GE's misappropriation of the LīfLīk® Trūwūd® woodgrain finish leads consumers to confuse the GE imitation with the RCA original.
Also note GE's nearly identical VHF and UHF controllers ... placed in the same location on the chassis as the RCA original. Even the speaker is located in the same way.
RCA patented the use of separate VHF and UHF knobs in 1958, the click-stop UHF knob in 1972, and the ergonomically efficient upper-right location of tuner knobs in 1952. These characteristics are innovations that help the consumer recognize an RCA television, and any use of these unique features without RCA's explicit permission is a breach of patent, trademark and copyright.
Ladybug
Aug 7, 06:28 PM
If you were picking on Mail.app's Stationery I'd probably agree with you.
None of the things that Time Machine has been compared to seem even close to what they are planning to do, including my own VMS file-versioning analogies. System Restore is not capable of restoring a single file, and particularly not within a running application. It seems more like a system-wide undo function when it comes to files...
B
Norton's GoBack, which was purchased from some other company, has a similar feature for restoring single files. This isn't quite the same thing, but the whole concept isn't entirely new. GoBack was introduced well before Microsoft came out with System Restore... That said, I think it's a great feature to include and I'm sure I'll find many uses for it.
boncellis
Jul 20, 09:06 AM
I wonder just how Apple would react to news that the next processor update is ahead of schedule. Presumably their plans are carefully laid out, and if a PC competitor can jump on Intel updates faster than they can without having to conform to a similar timeline, then Apple might get burned, if only slightly.
That's one aspect of the transition that I've always wondered about. Apple has often marketed new "products" more than "updates" in the past, but with Intel's speed of development, perhaps Apple will now focus more on updates and minimize redesigning/new releases. I don't think it's bad, just something of a departure from what I've grown accustomed to.
Benjamins
Mar 31, 08:12 PM
HA HA. You have got to be kidding me.
LOL, especially those who parade around using "Microsoft fanboy" as a buffer.
Intarweb
Apr 27, 08:04 AM
I wonder if this is why I can no longer get more than a day's charge on my iPhone 4 with minimal use, since it seems like it's an always-on thing.