
Old World vs. New World Computing


Jason Dunn
02-02-2010, 12:00 AM
http://stevenf.tumblr.com/post/359224392/i-need-to-talk-to-you-about-computers-ive-been

"In the New World, computers are task-centric. We are reading email, browsing the web, playing a game, but not all at once. Applications are sandboxed, then moats dug around the sandboxes, and then barbed wire placed around the moats. As a direct result, New World computers do not need virus scanners, their batteries last longer, and they rarely crash, but their users have lost a degree of freedom. New World computers have unprecedented ease of use, and benefit from decades of research into human-computer interaction. They are immediately understandable, fast, stable, and laser-focused on the 80% of the famous 80/20 rule. Is the New World better than the Old World? Nothing's ever simply black or white."

This is a really great "think piece" that's well worth reading if you're the kind of person who likes to think about where computers - and that includes mobile devices - are going to be moving in the next decade. As such, I'm posting it across all our sites to get the widest possible take on the topic. I want to hear from you!

Steven Frank, the author, posits that New World computers are task-centric and secure, and that this is the future of computing. For many scenarios, I think that works really well - but Frank doesn't seem to acknowledge that in order to do anything involving real content creation, an "old world" PC is still required. I'm happy to have a limited-in-functionality Web-pad style device next to my couch for Web surfing, tweeting, etc., but when I need to process raw files on a colour-calibrated monitor, or edit HD video? Old World computing rules those scenarios.

I also have to wonder how sophisticated the software can get on these New World computers - ever noticed how so many iPod Touch games are the same? Screen size and touch-only inputs are significant factors in what developers are able to do.

In some ways this boils down to the "appliances" vs. "computers" argument that has been going on for years. Appliances are more reliable, but they only do very specific things. Computers do infinitely more, but are generally less reliable than appliances. Which would you rather have, and why?

Macguy59
02-02-2010, 01:02 AM
In real world terms both have their place. I won't have any trouble replacing my unibody MBP with the Apple tablet because I have a 27" iMac to handle the heavy lifting stuff. There is something to be said for a device that only does a few things but does them well, without much concern for spyware/viruses, and is rock solid stable. Instant-on has been a holy grail for computers for a long time, and frankly I don't know why we're not there yet.

crimsonsky
02-02-2010, 01:36 AM
Of course there's room and need for both. But for most non-geeks or casual users, I'd suspect the New World computer model would be best for them. My wife, who is totally not into anything technological, would probably be a good candidate for New World computing.

I could EASILY replace my netbook with an iPad or even my iPhone (except for screen size) since I tend to use it for things that really don't require a lot of keyboard input (browsing, reading RSS feeds, Bible reading, and the like). But for writing and photo and video editing, gimme my desktop machine (currently a 2009 Mac Mini). It's going to be a LONG time before they can find a way to make these types of activities work well on a touch screen device.

Hooch Tan
02-02-2010, 02:05 AM
I think the only reason "New World Computing" can exist effectively is cloud computing and wireless access. Without persistent connectivity, the single-task devices that handle email, remote controls, etc. would be far more limited in scope.

There still needs to be some balancing, as we'll want devices that can do more and more, and then want greater battery life. The iPad offers a great deal of flexibility at a cost of battery life, while the Kindle, much more limited in scope, can go for a considerably longer period without charging. I even remember Microsoft's attempt with their SPOT watches, which needed to be charged every day, but offered far more than what a typical watch would do. Perhaps it will be a perpetual see-saw.

But back to cloud computing, as interfaces to our data improve, I think that some form of cloud computing, even if it is our own home networks, will be the result of this change. Imagine being able to set up what you want processed for a video, and then have a home server do all the processing from a remote device, instead of being tied to a desktop!

Macguy59
02-02-2010, 02:16 AM
The iPad offers a great deal of flexibility at a cost of battery life, while the Kindle, much more limited in scope, can go for a considerably longer period without charging.

We'll need to wait and see how Apple's battery life numbers hold up under testing, but it seems impressive. What would happen to the Kindle's if you added color? I do get what you're saying though.

Hooch Tan
02-02-2010, 03:06 AM
We'll need to wait and see how Apple's battery life numbers hold up under testing, but it seems impressive. What would happen to the Kindle's if you added color? I do get what you're saying though.

Actually, assuming that the iPad even gets 8 hours with reasonable usage, that's still quite good. I think the Kindle could get colour without much of a sacrifice, as there are colour e-ink displays, but it'd still be the same slow refresh. Steve Jobs does have a point about who would read for that many hours straight. Sure there are some, but I'm the kind of person who charges everything the moment he gets home, so if it were a choice, I'd go for the iPad over the Kindle. Whether other people would make the same choice? I dunno, but the regular charging is worth it in my opinion.

Macguy59
02-02-2010, 04:24 AM
I'd go for the iPad over the Kindle. Whether other people would make the same choice? I dunno, but the regular charging is worth it in my opinion.

I don't know if it's much of a threat to the regular Kindle but the Kindle DX should be looking over its chassis ;)

Russ Smith
02-02-2010, 02:02 PM
The keywords in the intro quote were "but not all at once" for me. Why is that? It strikes me as a backward-engineered bow to the iPad, since it can only do one thing at a time (at least for now). The thing is, I like to be able to do at least two things at once a lot of the time. Reading e-mail while watching a video; checking Facebook while waiting for a download; there are a lot of times when one task doesn't require constant monitoring. Even on my phone, I want to be able to look up an address while talking to the person that needs it.

There are some NAS boxes that also support Torrents and such, allowing you to start a process and have it continue without being "connected" to it. That may be part of the solution for real usefulness in New World computers.

"New World computers do not need virus scanners, their batteries last longer, and they rarely crash." seems like wishful thinking to me or at least predicated on the "but not all at once" scenario. It seems to me that, as long as you have clever, but morally-deficient people, you'll have to deal with some form of mal-ware. The sandbox and single execution only make it harder.

I do think that we'll eventually move to a dual device mode where the device we carry will have built-in phone capabilities but rely on a client for remote access/display to a much more powerful base device for bigger applications. Obviously, that has to be predicated on fast, nearly faultless, ubiquitous wireless access in order to work. It also requires some major work on the interface problem. Right now the screens are either too big (iPad) or too small (most smartphones) to be both carry-able and usable for more than basic needs. (There's only so much resizing and scrolling that's tolerable.) The same is true of keyboards.

The handheld device in the last paragraph could be an appliance computer of some sort (though I still think it needs to multi-task a bit).

The thing about an open client like that is that you could use it equally well and simultaneously with a user-owned base unit and with commercial server-based services.

I'll agree, however, that for some folks, an appliance computer is all they'll ever want, need, and use. The loss of freedom, for them, is only theoretical.

I'd also agree with the author that such computers would benefit from "decades of research into human-computer interaction". My only caveat is that multiple vendors, while ultimately best for the consumer, also seem to result in a multitude of somewhat incompatible user interfaces and interaction styles. The solution would be to somehow make the interface portable so that you could use whatever user interface you wanted (or even multiple interfaces on the same device). Apple, who sells their hardware by selling the interface, wouldn't go for that at all.

ptyork
02-02-2010, 04:32 PM
"Old World" vs. "New World?" Really? The development of new technologies has always followed the same basic path -- development, refinement (specialization to task), miniaturization, mobilization, convergence (and then further rounds of miniaturization and convergence). Use the phone for example: "crank" wall phone-->rotary dial-->touch tone->wireless phone-->mobile phone-->smart phone-->smaller smart phone->smarter smart phones-->etc. The ONLY reason for a completely specialized device is because the technology isn't there yet to make it efficient and cost effective to have that device perform multiple tasks. To ignore this is silly. People don't WANT to buy a billion devices. They will if they have to to fill their "needs," but the most desirable and most efficient thing is to have a converged device.

The iPad class of device will eventually render dedicated readers completely obsolete as technology matures, but it will also continue to become more of a multi-purpose, flexible, and power-user friendly class of device (a la the "Old World"). This is inevitable. I'm not 100% sure that his arguments are completely incompatible with this reality, but he seems to come to incorrect conclusions. Believe me, the world is NOT moving towards a million different appliances. It is simply continuing along the path described above.

Yes, we've refined the interface and made things easier, but we ALWAYS go back to flexibility once such a move is possible without completely losing the simplicity. The author is obviously too young to remember the move from text-based to GUI-based computing. We initially gave up power for power-users in favor of more universal ease of use. And yes, there was screaming galore (I for one hated it). But eventually the power was gained back as the interfaces improved and a new breed of power-users evolved into that "new world" and demanded more efficiency. The need for power/flexibility and the desire for simplicity converged. This is no different. And to think of it as an entirely new world is naive at best.

/rant mode off :)

Gerard
02-02-2010, 08:50 PM
The biggest area of concern for me is the shifting of user data from local, isolated storage locations into the 'cloud' of data. I'm fine with web-based applications; that's just another way of doing the same tasks, in many cases increasing efficiency in a number of ways (which we needn't detail as they're common knowledge by now). But cloud-based data storage = danger for individual users. Of course that same danger exists for many, perhaps most users anyway in the local storage scenario of 'old world' computing, considering how few people actually bother to make regular backups in reliable formats (using backup software, alongside separate 'ghosting' of data using simple copy/paste operations) and in multiple locations, at least one being off-site. I've suffered enough hardware and software malfunctions to know only too well that backups can never be too frequent or in too many formats, bounded of course by common sense. I'll leave the extremes of backing up to obsessive/compulsive types.

So in this shiny new age of cloud-based data and processing power we are to trust that our admins will maintain flawless backups as we need them, while riding shotgun to protect against myriad sources of attack against both our data and their servers? A certain still recent event involving Sidekick users comes to mind. Rather a significant 'oops' and a cautionary tale which should not fall from general awareness, but seemingly does so anyway.

My reaction to these new devices and the new paradigm is largely hostile, not because I am a power user, as really I am not. But the stuff I use it for cannot yet be done 'in the cloud' in large part, and even those things which could be done there I don't want to be done there owing to privacy concerns (I don't want my videos and still images to be shared unless by my intention, but there's abundant evidence no such expectation of privacy is justified when talking about social media sites, so why would cloud computing services be any different?), data loss concerns (if I lose a file, I usually have at least one spare copy on separate media because that's how I work - usually it's more like three distinct locations, five locations for my most important business data), and the simple desire to keep my stuff... as, well, my stuff, not anyone else's to mess with in any way, even if their intentions are good and their practices immaculately careful and thoroughly wise. The sense of being baby-sat is not fun, in my opinion. Some may like it, find it comforting knowing that Apple is maintaining their libraries of music, books, films, eventually iDocuments, whatever. Gives me the iCreeps.

So no thanks to the new world. I keep my old papers in places I can get at them and sort through them. Same with my old books (when I have let them out, a significant portion have not been returned), my CDs (don't want other people scratching them), my socks, whatever I need in my life. When I'm dead, whatever, 'they' can do what they like with my stuff both physical and virtual and I simply won't care. Meantime I want my computer-based information to stay home.

Discussions like this are the most 'out there' form of my personhood in the computer realm. And remember 'back in the day' when Dale Coffing's pocketpcpassion.com was one of the busiest places for guys like us? Remember how RAID failed him and lost something over half a million posts? Remember how, after painfully time-consuming and expensive attempts, he failed to recover any of it, but rebuilt anyway and people came back and contributed masses of further discussion... which was then all lost in another year or so, in another sort of crash, in spite of much greater (Microsoft-supported) precautions being taken?

The 'cloud' back then was mostly in websites like that, filled to brimming with useful and accessible information, and guarded carefully by well-intentioned and well-trained people. And it got gone. What's really changed? Aren't servers still pretty much the same, if not more heavily burdened considering the vast increases in load and the difficulties in scaling hardware and support to meet such escalating demand? How is the cloud going to cope with a billion+ users expecting 100% reliability for their most important data, not just relatively trivial tech discussions such as we mostly see in forums? I see big, big problems ahead for this new world, and public relations disasters for any company providing such services without adequately cautioning users to make local copies of important data.

Hooch Tan
02-02-2010, 09:12 PM
But the stuff I use it for cannot yet be done 'in the cloud' in large part, and even those things which could be done there I don't want to be done there owing to privacy concerns (I don't want my videos and still images to be shared unless by my intention, but there's abundant evidence no such expectation of privacy is justified when talking about social media sites, so why would cloud computing services be any different?), data loss concerns (if I lose a file, I usually have at least one spare copy on separate media because that's how I work - usually it's more like three distinct locations, five locations for my most important business data), and the simple desire to keep my stuff...

Hear hear! I'm all for owning one's own data, and I'd like to think I have most of the know-how to do so. But most people apparently don't know, care, or want to spend the time in managing their own information, and are willing to give up a lot of privacy for that convenience. At least until they lose their data. Every so often, I hear people complain about losing access to their Hotmail or Gmail, or even data on a CD or USB drive that has "expired". And questions about backups are met with a blank stare. *sigh*


Discussions like this are the most 'out there' form of my personhood in the computer realm. And remember 'back in the day' when Dale Coffing's pocketpcpassion.com was one of the busiest places for guys like us? Remember how RAID failed him and lost something over half a million posts? Remember how, after painfully time consuming and expensive attempts he failed to recover any of it, but rebuilt anyway and people came back and contributed masses of further discussion... which was then all lost in another year or so, in another sort of crash, in spite of much greater (Microsoft-supported) precautions being taken?

I don't recall the details surrounding the second crash, but to be fair, a RAID array (RAID 1, 5, 1+0, etc.) is only intended to protect against hardware failure, not data corruption, which is what I believe happened in the first crash.

With regards to reliability, I believe it comes down to cost, convenience, and effort. No device is 100% reliable, so there's redundancy and backups. So you set up at least one backup. But how often do you test your backups? Admittedly, I don't, and I know I should. I've read horror stories where companies have made the effort and paid the extra cost for a backup system, but then never check to make sure it is working properly on a regular basis (which would cost them extra time and money as well), and then find out that they've been making blank backups for 6 months! At the consumer level, I just do not think most people are willing to go through all the necessary hoops to ensure that they don't lose one bit of their data. And they are not willing to pay the cost, even with cloud computing, that corporate or military-spec reliability would require.
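If anyone wants to automate that sanity check, something along these lines is roughly what I mean - just a rough Python sketch, assuming the backup is a plain mirrored directory tree; the SOURCE and BACKUP paths are made up, not anything real:

# Rough sanity check for a mirrored backup: catches empty or missing copies.
# SOURCE and BACKUP are hypothetical paths; adjust them to your own layout.
import os

SOURCE = "/home/me/documents"
BACKUP = "/mnt/external/documents-backup"

def tree_stats(root):
    """Return (file count, total bytes) for everything under root."""
    count, size = 0, 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            count += 1
            size += os.path.getsize(os.path.join(dirpath, name))
    return count, size

src_count, src_size = tree_stats(SOURCE)
bak_count, bak_size = tree_stats(BACKUP)

print(f"source: {src_count} files, {src_size} bytes")
print(f"backup: {bak_count} files, {bak_size} bytes")
if bak_count < src_count or bak_size < src_size * 0.95:
    print("WARNING: backup looks incomplete - check it before you need it!")

It won't prove the files are readable, but it would at least catch the "blank backups for 6 months" case.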

Gerard
02-02-2010, 10:04 PM
I don't recall the details surrounding the second crash, but to be fair, a RAID (RAID 1, 5, 1+0, etc) array is only intended to protect against hardware failure, not data corruption, which is what I believe happened in the first crash.
I don't know the exact details either, but yes, the initial failure was in large part due to treating RAID as though it were a backup solution. When both drives failed (in the same box I think), all was lost. The causes of the second crash remained, I think, a mystery to outsiders. Dale kind of disappeared after that, though he kept going to trade shows for some years.
...But how often do you test your backups? Admittedly, I don't, and I know I should. I've read horror stories where companies have made the effort and paid the extra cost for a backup system, but then never check to make sure it is working properly on a regular basis (which would cost them extra time and money as well), and then find out that they've been making blank backups for 6 months! At the consumer level, I just do not think most people are willing to go through all the necessary hoops to ensure that they don't lose one bit of their data. And they are not willing to pay the cost, even with cloud computing, that corporate or military-spec reliability would require.

Sadly, most computer users will one day experience some level of personal and/or business data loss due to hardware or software failures, or user error. Not much we can do about people deciding to walk in front of that particular bus, without becoming the same sort of babysitter Microsoft and Apple are in their different ways.

But I do mention it to people when it seems relevant, when I know they have data which is important to them and it seems they have not considered making at least one backup copy. And as I mentioned above, as I frequently say in such conversations both in and out of forums, entrusting one's information to the vagaries of a backup program (usually left with more or less default settings by most users, which usually means some sort of proprietary compression, subject to rendering data irrecoverable in cases of corruption) is foolishness. Hence the manual copy recommendation. It's slow, and needs a minute or two's worth of thought, but it's really not very challenging, and it's getting easier year by year as USB and wireless get faster, as do drives, and as bigger drives become more affordable.
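For anyone who would rather script the manual copy than trust a backup program's proprietary format, a bare-bones Python sketch like this is about all it takes (the paths are hypothetical; it just mirrors the folder as ordinary files, nothing compressed):

# Plain, uncompressed mirror of a folder - no proprietary archive format involved.
# SOURCE and DEST are hypothetical; point them at your data and your external drive.
import shutil
import time

SOURCE = "/home/me/photos"
DEST = "/mnt/external/photos-" + time.strftime("%Y%m%d")

# copytree keeps the directory structure and timestamps (via copy2) intact,
# so every file in the backup stays directly readable on its own.
shutil.copytree(SOURCE, DEST, copy_function=shutil.copy2)
print("copied", SOURCE, "->", DEST)

Because the date is part of the destination name, each run leaves a separate, fully readable copy, which makes keeping the last two backups around trivial.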

I know lots of people who keep all their family images and videos on one computer only, and make no backups of any sort. A half-terabyte external notebook drive now costs under $100. There is no practical excuse for such risks any more, when multiple copies of all our data can be put into such small drives, when it's practical to carry 500GB (okay, 460GB or so) in one's pocket most of the time without really noticing it (my Hitachi is really small and sleek), while leaving other copies at home and perhaps at a relative's house.

I 'test' my backups by going into them (always uncompressed - I've seen too many corrupt ZIP and RAR archives to risk using them any more) and opening a few dozen random files every few weeks or so, and I always keep at least the last two backups on any given media. Some media spin, but for my most important information I also use solid state cards. I don't bother with optical, as I've seen too many bad writes there, but maybe that's just my bad luck with hardware. I could probably test more often, but so far GenieSoft Backup and manual copies have proven pretty reliable on my PCs (and my daughter's and my wife's) and Resco Backup and PIMBackup and manual copies the same on my phone.
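For what it's worth, the 'open a few dozen random files' routine is easy to script as well - a rough Python sketch, assuming the backup mirrors the source tree (paths and sample size are hypothetical), that picks random files and compares their checksums against the originals:

# Spot-check a mirrored backup: hash a random sample of files against the source.
# SOURCE and BACKUP are hypothetical paths; SAMPLE is how many files to check.
import hashlib
import os
import random

SOURCE = "/home/me/documents"
BACKUP = "/mnt/external/documents-backup"
SAMPLE = 30

def sha256(path):
    """Hash a file in chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Collect the relative path of every file in the source tree.
files = []
for dirpath, _dirs, names in os.walk(SOURCE):
    for name in names:
        files.append(os.path.relpath(os.path.join(dirpath, name), SOURCE))

for rel in random.sample(files, min(SAMPLE, len(files))):
    src, bak = os.path.join(SOURCE, rel), os.path.join(BACKUP, rel)
    if not os.path.exists(bak):
        print("MISSING in backup:", rel)
    elif sha256(src) != sha256(bak):
        print("MISMATCH:", rel)
print("spot check done")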