02-02-2010, 12:00 AM
Executive Editor
Join Date: Aug 2006
Posts: 29,160
Old World vs. New World Computing
"In the New World, computers are task-centric. We are reading email, browsing the web, playing a game, but not all at once. Applications are sandboxed, then moats dug around the sandboxes, and then barbed wire placed around the moats. As a direct result, New World computers do not need virus scanners, their batteries last longer, and they rarely crash, but their users have lost a degree of freedom. New World computers have unprecedented ease of use, and benefit from decades of research into human-computer interaction. They are immediately understandable, fast, stable, and laser-focused on the 80% of the famous 80/20 rule. Is the New World better than the Old World? Nothing's ever simply black or white." This is a really great "think piece" that's well worth reading if you're the kind of person that likes to think about where computers - and that includes mobile devices - are going to be moving in the next decade. As such, I'm posting it across all our sites to get the widest possible take on the topic. I want to hear from you! Steven Frank, the author, posits that new world computers are task-centric and secure, and that's the future of computing. For many types of scenarios, I think that works really well - but Frank doesn't seem to acknowledge that in order to do anything involving real content creation, an "old world" PC is still required. I'm happy to have a limited-in-functionality Web-pad style device next to my couch for Web surfing, tweeting, etc., but when I need to process raw files on a colour calibrated monitor, or edit HD video? Old world computing rules those scenarios. I also have to wonder how sophisticated the software can get on these New World computers - ever noticed how so many iPod Touch games are the same? Screen size and touch-only inputs are significantly factors on what developers are able to do. In some ways this boils down to the "appliances" vs. "computers" argument that has been going on for years. Appliances are more reliable, but they only do very specific things. Computers do infinitely more, but are generally less reliable than appliances. Which would you rather have, and why?
02-02-2010, 01:02 AM
Mystic
Join Date: Sep 2006
Posts: 1,608
In real-world terms, both have their place. I won't have any trouble replacing my unibody MBP with the Apple tablet because I have a 27" iMac to handle the heavy lifting. There is something to be said for a device that only does a few things but does them well, without (much) concern for spyware and viruses, and is rock-solid stable. Instant-on has been a holy grail for computers for a long time, and frankly I don't know why we're not there yet.
__________________
27" iMac 3.06GHz Intel Core 2 Duo 8GB RAM
16GB LTE iPad3, 13" Macbook Air Core i5 w/128GB SSD
iPhone 4S (16GB), AppleTV 2.0
Last edited by Macguy59; 02-02-2010 at 01:10 AM..
02-02-2010, 01:36 AM
Thinker
Join Date: May 2006
Posts: 367
Of course there's room and need for both. But for most non-geeks and casual users, I suspect the New World computing model would suit them best. My wife, who is totally not into anything technological, would probably be a good candidate for New World computing.
I could EASILY replace my netbook with an iPad, or even my iPhone (except for screen size), since I tend to use it for things that really don't require a lot of keyboard input (browsing, reading RSS feeds, Bible reading, and the like). But for writing and for photo and video editing, gimme my desktop machine (currently a 2009 Mac Mini). It's going to be a LONG time before they find a way to make those kinds of activities work well on a touch-screen device.
__________________
XBox 360 S, 16GB iPhone 4S, iPod Classic 160 GB, Dell Inspiron Mini 1018; Macs: Mac Mini 2.4 GHz 6 GB RAM; Macbook 2.0 GHz 3 GB RAM; MacBook Air 11", 24" Cinema Display
02-02-2010, 02:05 AM
Contributing Editor
Join Date: Feb 2002
Posts: 918
I think the only reason "New World" computing can exist effectively is cloud computing and wireless access. Without persistent connectivity, single-task devices for handling email, remote controls, etc. would be far more limited in scope.
There still needs to be some balancing, as we'll want devices that can do more and more, and then want greater battery life. The iPad offers a great deal of flexibility at a cost in battery life, while the Kindle, much more limited in scope, can go for a considerably longer period without charging. I even remember Microsoft's attempt with its SPOT watches, which needed to be charged every day but offered far more than a typical watch. Perhaps it will be a perpetual see-saw.
But back to cloud computing: as interfaces to our data improve, I think some form of it, even if it's just our own home networks, will be where this change ends up. Imagine being able to queue up a video for processing from a remote device, and then have a home server do all the work instead of being tied to a desktop!
02-02-2010, 02:16 AM
Mystic
Join Date: Sep 2006
Posts: 1,608
Quote:
Originally Posted by Hooch Tan
The iPad offers a great deal of flexibility at a cost in battery life, while the Kindle, much more limited in scope, can go for a considerably longer period without charging.
We'll need to wait and see how Apple's battery life numbers hold up under testing, but they seem impressive. What would happen to the Kindle's if you added color? I do get what you're saying, though.
__________________
27" iMac 3.06GHz Intel Core 2 Duo 8GB RAM
16GB LTE iPad3, 13" Macbook Air Core i5 w/128GB SSD
iPhone 4S (16GB), AppleTV 2.0
02-02-2010, 03:06 AM
Contributing Editor
Join Date: Feb 2002
Posts: 918
Quote:
Originally Posted by Macguy59
We'll need to wait and see how Apple's battery life numbers hold up under testing, but they seem impressive. What would happen to the Kindle's if you added color? I do get what you're saying, though.
Actually, assuming the iPad even gets 8 hours with reasonable usage, that's still quite good. I think the Kindle could get colour without much of a sacrifice, as there are colour e-ink displays, but it'd still be the same slow refresh. Steve Jobs does have a point about how few people would read for that many hours straight. Sure, there are some, but I'm the kind of person who charges everything the moment he gets home, so if it were a choice, I'd go for the iPad over the Kindle. Whether other people would make the same choice? I dunno, but the regular charging is worth it in my opinion.
02-02-2010, 04:24 AM
Mystic
Join Date: Sep 2006
Posts: 1,608
Quote:
Originally Posted by Hooch Tan
I'd go for the iPad over the Kindle. Whether other people would make the same choice? I dunno, but the regular charging is worth it in my opinion.
I don't know if it's much of a threat to the regular Kindle, but the Kindle DX should be looking over its chassis.
__________________
27" iMac 3.06GHz Intel Core 2 Duo 8GB RAM
16GB LTE iPad3, 13" Macbook Air Core i5 w/128GB SSD
iPhone 4S (16GB), AppleTV 2.0
02-02-2010, 02:02 PM
Intellectual
Join Date: Feb 2002
Posts: 197
The key words in the intro quote, for me, were "but not all at once." Why is that? It strikes me as a backward-engineered bow to the iPad, since it can only do one thing at a time (at least for now). The thing is, I like to be able to do at least two things at once a lot of the time: reading e-mail while watching a video, checking Facebook while waiting for a download; there are a lot of times when one task doesn't require constant monitoring. Even on my phone, I want to be able to look up an address while talking to the person who needs it.
There are some NAS boxes that also support Torrents and such, allowing you to start a process and have it continue without being "connected" to it. That may be part of the solution for real usefulness in New World computers.
"New World computers do not need virus scanners, their batteries last longer, and they rarely crash." seems like wishful thinking to me or at least predicated on the "but not all at once" scenario. It seems to me that, as long as you have clever, but morally-deficient people, you'll have to deal with some form of mal-ware. The sandbox and single execution only make it harder.
I do think we'll eventually move to a dual-device mode, where the device we carry will have built-in phone capabilities but rely on a client for remote access/display to a much more powerful base device for bigger applications. Obviously, that has to be predicated on fast, nearly faultless, ubiquitous wireless access in order to work. It also requires some major work on the interface problem. Right now the screens are either too big (iPad) or too small (most smartphones) to be both carryable and usable for more than basic needs. (There's only so much resizing and scrolling that's tolerable.) The same is true of keyboards.
The handheld device in the previous paragraph could be an appliance computer of some sort (though I still think it needs to multitask a bit).
The thing about an open client like that is that you could use it equally well, and simultaneously, with a user-owned base unit and with commercial server-based services.
I'll agree, however, that for some folks an appliance computer is all they'll ever want, need, and use. The loss of freedom, for them, is only theoretical.
I'd also agree with the author that such computers would benefit from "decades of research into human-computer interaction." My only caveat is that multiple vendors, while ultimately best for the consumer, also seem to result in a multitude of somewhat incompatible user interfaces and interaction styles. The solution would be to somehow make the interface portable, so that you could use whatever user interface you wanted (or even multiple interfaces on the same device). Apple, which sells its hardware by selling the interface, wouldn't go for that at all.
__________________
HTC HD2 US (unlocked) + 16GB micro SDHC (in holding)
HTC Evo + 16GB micro SDHC (in use)
02-02-2010, 04:32 PM
Sage
Join Date: Jul 2005
Posts: 639
"Old World" vs. "New World?" Really? The development of new technologies has always followed the same basic path -- development, refinement (specialization to task), miniaturization, mobilization, convergence (and then further rounds of miniaturization and convergence). Use the phone for example: "crank" wall phone-->rotary dial-->touch tone->wireless phone-->mobile phone-->smart phone-->smaller smart phone->smarter smart phones-->etc. The ONLY reason for a completely specialized device is because the technology isn't there yet to make it efficient and cost effective to have that device perform multiple tasks. To ignore this is silly. People don't WANT to buy a billion devices. They will if they have to to fill their "needs," but the most desirable and most efficient thing is to have a converged device.
The iPad class of device will eventually make dedicated readers completely obsolete as the technology matures, but it will also continue to become a more multi-purpose, flexible, and power-user-friendly class of device (a la the "Old World"). This is inevitable. I'm not 100% sure that his arguments are completely incompatible with this reality, but he seems to come to incorrect conclusions. Believe me, the world is NOT moving towards a million different appliances. It is simply continuing along the path described above.
Yes, we've refined the interface and made things easier, but we ALWAYS go back to flexibility once such a move is possible without completely losing the simplicity. The author is obviously too young to remember the move from text-based to GUI-based computing. We initially gave up power-user power in favor of more universal ease of use. And yes, there was screaming galore (I, for one, hated it). But eventually the power was gained back as the interfaces improved and a new breed of power users evolved within that "new world" and demanded more efficiency. The need for power/flexibility and the desire for simplicity converged. This is no different. And to think of it as an entirely new world is naive at best.
/rant mode off
02-02-2010, 08:50 PM
Pontificator
Join Date: Feb 2002
Posts: 1,043
The biggest area of concern for me is the shifting of user data from local, isolated storage into the "cloud." I'm fine with web-based applications; that's just another way of doing the same tasks, and in many cases it increases efficiency in a number of ways (which we needn't detail, as they're common knowledge by now). But cloud-based data storage = danger for individual users. Of course that same danger exists for many, perhaps most, users in the local-storage scenario of "old world" computing anyway, considering how few people actually bother to make regular backups in reliable formats (using backup software, alongside separate "ghosting" of data with simple copy/paste operations) and in multiple locations, at least one of them off-site. I've suffered enough hardware and software malfunctions to know only too well that backups can never be too frequent or in too many formats, bounded of course by common sense. I'll leave the extremes of backing up to the obsessive-compulsive types.
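Just to illustrate what I mean by the "ghosting" half of that routine, here's a rough sketch in Python: plain copies of one folder to several drives, one of which gets rotated off-site. The paths and folder names are made-up placeholders, not a recommendation of any particular tool or setup.
Code:
# Rough sketch only: make plain, browsable copies of one folder in several places.
# All paths below are hypothetical placeholders.
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path("D:/business-data")        # the data worth protecting
DESTINATIONS = [
    Path("E:/backup"),                   # second internal/external drive
    Path("//nas/backup"),                # home NAS share
    Path("F:/offsite-rotation"),         # drive that gets carried off-site
]

stamp = datetime.now().strftime("%Y-%m-%d")
for dest in DESTINATIONS:
    target = dest / f"business-data-{stamp}"
    shutil.copytree(SOURCE, target)      # straight copy - no restore software needed
    print(f"copied {SOURCE} -> {target}")
The point isn't the script; it's that a plain copy in several places needs nothing special to read back, which is exactly the property the cloud asks you to give up.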
So in this shiny new age of cloud-based data and processing power, we are to trust that our admins will maintain flawless backups whenever we need them, while riding shotgun against myriad sources of attack on both our data and their servers? A certain still-recent event involving Sidekick users comes to mind. Rather a significant "oops," and a cautionary tale that should not fall from general awareness, but seemingly does so anyway.
My reaction to these new devices and the new paradigm is largely hostile, and not because I am a power user - really, I am not. But most of the stuff I use a computer for cannot yet be done "in the cloud," and even the things that could be done there, I don't want done there. Partly it's privacy: I don't want my videos and still images shared unless I intend them to be, and there's abundant evidence that no such expectation of privacy is justified with social media sites, so why would cloud computing services be any different? Partly it's data loss: if I lose a file, I usually have at least one spare copy on separate media, because that's how I work - usually it's more like three distinct locations, five for my most important business data. And partly it's the simple desire to keep my stuff as, well, my stuff - not anyone else's to mess with in any way, even if their intentions are good and their practices immaculately careful and thoroughly wise. The sense of being baby-sat is not fun, in my opinion. Some may like it, and find it comforting to know that Apple is maintaining their libraries of music, books, films, eventually iDocuments, whatever. Gives me the iCreeps.
So, no thanks to the new world. I keep my old papers in places where I can get at them and sort through them. Same with my old books (when I have lent them out, a significant portion have not been returned), my CDs (I don't want other people scratching them), my socks, whatever I need in my life. When I'm dead, whatever - "they" can do what they like with my stuff, both physical and virtual, and I simply won't care. Meantime, I want my computer-based information to stay home.
Discussions like this are the most "out there" form of my personhood in the computer realm. And remember "back in the day," when Dale Coffing's pocketpcpassion.com was one of the busiest places for guys like us? Remember how a RAID failure lost him something over half a million posts? Remember how, after painfully time-consuming and expensive attempts, he failed to recover any of it, but rebuilt anyway, and people came back and contributed masses of further discussion... which was then all lost a year or so later, in another sort of crash, in spite of much greater (Microsoft-supported) precautions being taken? The "cloud" back then was mostly in websites like that, filled to brimming with useful and accessible information, and guarded carefully by well-intentioned and well-trained people. And it got gone. What's really changed? Aren't servers still pretty much the same, if not more heavily burdened, considering the vast increases in load and the difficulty of scaling hardware and support to meet such escalating demand? How is the cloud going to cope with a billion-plus users expecting 100% reliability for their most important data, not just relatively trivial tech discussions such as we mostly see in forums? I see big, big problems ahead for this new world, and public relations disasters for any company providing such services without adequately cautioning users to keep local copies of important data.
__________________
Gerard Ivan Samija