The “cloud” is a relatively new term floated about by computer scientists and product marketers alike, but is a concept both bold and just over the horizon for all of us who own a computer, tablet, smartphone or gaming device.
Cloud computing can best be described by using your imagination...imagine that your computer, i-device, or any other gadget of yours that holds data and software applications – no longer does! In cloud computing, all of that is stored somewhere mysterious, somewhere far away – in most cases, on a server farm in rural North America.
Your computing device then becomes nothing more than a simple web browser, much like the new Google Chromebook, which is a netbook sans what you would traditionally call software packages: programs that need installing, updating, and tender loving software-care. All that you generate and ingest (photos, movies, documents, spreadsheets, and more) is homed not on this revolutionary new device, but off in the “cloud” instead.
While computer scientists have been talking about this for decades, it's only recently that cloud computing has become a reality – and many of you are already doing it without even knowing it. Take for example Gmail and Hotmail, two prime examples of the first mainstream cloud applications, where all your data – as well as the interface into that data – is stored online and accessed via a simple web browser of your choosing, at a location of your choosing as well (home, work, cybercafé, etc.).
“Every cloud has its silver lining but it is sometimes a little difficult to get it to the mint.”
— Don Marquis, American Poet, 1901
But a lot more “clouds” are about to be mainstreamed...take for example Apple’s new iCloud offering, announced and due out this fall. In this brave new Apple world, items purchased from the online iTunes store stay online – in your account – and are instantly synchronized with every other Apple device that you may own. For example, your iMac, Macbook Air, iPad2, or even your lowly iPod Nano is simultaneously and instantly filled with your purchases or uploaded additions. No more mp3 files to store or accidentally delete, and if you buy a new device, no problem: your music instantly appears there as soon as you sign in to iTunes in the cloud.
Another cloud example from software giant Microsoft is starting to rain down now on the masses as well, with both Xbox 360 and Office 365 being rolled out soon.
In both cases, applications (Xbox titles, Word, Excel, PowerPoint, etc.) as well as the data that goes along with them (your high scores, Word files, presentations, spreadsheets, etc.) are all stored in one of Microsoft’s huge cumulus clouds, where not only you have access, but your friends, family, co-workers, and fellow gamers do as well.
But there are many critics of this new direction in computing, and they all forecast doom and gloom. Just a few short years ago, Larry Ellison (founder of Oracle) called cloud computing "fashion-driven" and "complete gibberish". Yet just last year Oracle released its highly acclaimed Exalogic machine, a high-powered "cloud in a box," which is reported to be selling like hotcakes today.
GNU founder and tech author Richard Stallman calls cloud computing “a trap,” and "Stupidity... It's worse than stupidity: it's a marketing hype campaign." But then again, Stallman also believes that the US Government is encouraging the use of cloud computing because this allows them to access your data without needing a search warrant.
But regardless of your tin-hat conspiracy leanings, criticism of cloud computing boils down to two valid points: 1) you no longer have physical control of your media, whether that be a game or a shopping list, and 2) you are totally dependent on Internet access for all that you put in the cloud.
For us in Nepal, dependence on service providers like NTC and World Link is risky, as we could be in a world of hurt when we need our monthly budget reports or photos to send to grandma, but can’t get to them because of network congestion or “sun spots” – a reason often cited by World Link when its Internet service goes down.
But there is no denying that the forecast for the future is indeed full of clouds, coming over the horizon swiftly in one form or another, from complete devices dependent on them (Google’s Chromebook) to new services like iCloud and Xbox 360, where your “most precious” is no longer on earth next to you, but instead, flowing to and from the cloud.
Created in 2003 by Andy Rubin, an ex-Apple engineer and now a Google engineer, the now Google-owned Android operating system is being activated in both mobile phones and tablets at an alarming rate: hundreds of thousands per day, seven days a week. Analysts tell us today that sales of Android-based smartphones are outpacing the #1 phone in the world, the iPhone4. Folks that sift through such numbers for a living also say that Android phones now dominate the market worldwide, leaving Nokia, Blackberry, and even Apple in the proverbial dust with a 46% global market share in Q2 2011.
But just what is Android, and why should you care if this invasion is gripping the planet?
First, like Apple's iOS, Android is what runs all the applications and functions on any Android-equipped smartphone. It is very much like the operating system running your home or business computer, such as Windows 7 or OSX Lion. Without such an operating system, your smartphone would instead be "stupid," as in the olden days of simple feature phones.
Today's smartphones and tablets are running processors that rival the one in your netbook, and are able to display high-definition video as well as browse web pages – in addition to taking the occasional phone call. So a decent operating system is now required: one that supports multimedia, multi-touch screens, multi-tasking, GPS, accelerometers, gyroscopes, proximity sensors, and even thermometers.
The reason Android has to support all this is that developers are cranking out applications that utilize these features at a staggering rate: over 250,000 applications in the online Android Market and over 4.5 billion downloads to date. These applications range from simple kids' games to sophisticated scientific tools to savvy social media aids. And for the money-minded, over 57% of all applications for Android are free, versus roughly 27% of the apps available on the iTunes Store for Apple's iPhone.
Which brings us to the ultimate question on everyone's mind these days when shopping for a new smartphone: Apple or Android, which should I choose? Let's just take the top contender from both camps (at the moment): the iPhone4 from Apple and the Galaxy S II from Samsung. The SII is considered to be the elusive iPhone-killer people have been yakking about ever since the iPhone became the leader in smartphone design and function.
In Nepal, the SII sells for up to Nrs. 15,000 less than a comparable iPhone4, sports a bigger touchscreen, and can play Flash web content, unlike the Apple contender.
The SII is also running a 1.2GHz ARMv7 dual-core processor, allowing Gingerbread (the code name for the latest Android OS) to run at blazing speeds, and it feels much more zippy than Apple's iOS4 when doing similar things, like gesturing around screens and booting up applications.
Ease of use is also a factor, and here is where the Android OS shines. No need for an in-between syncing program like iTunes to add music and other data to your smartphone from your PC. Just drag and drop, and that's all there is to it.
Both the SII and the iPhone are built rock solid, employing hardy Gorilla Glass fronts, but Samsung has opted for a textured plastic back, instead of the iPhone's glass-and-steel design.
But like all smartphones on the market today, nothing, not even the Android OS, can save you from charging your device nightly. Both the SII and iPhone4 must be plugged in after a heavy day of use, as the battery can't go another day without a decent charge. For ex-feature-phone users, this may take some getting used to, as your old phone could perhaps go a week without plugging in.
Sticker shock may also befall the former feature-phone user, as the Android-powered Galaxy S II runs about Nrs. 52,000 in the capital, and is rarely in stock. However, there are many other smartphones running the Android OS for a bit less than that. Rupee-pinching consumers can also experience the Android invasion on any of these Android-powered models: the HTC Dream or Desire, the Nexus One or S, or any of the Motorola Droids.
But worrying too much about technical specs is an exercise in futility when it comes to smartphones, for as soon as you buy one model, there is another waiting at the gate that is thinner, faster, smarter. This is true no matter what make or model you are looking at. For example, look for the iPhone5 out later this year, as well as an upgraded Samsung Galaxy S II made to compete.
But already left in the wake of war is Nokia, which has seen a global market decline as rapid as the fall of the Soviet Union back in the day.
Apple has now eclipsed the one-time world leader Nokia in the mobile phone market in sheer volume of units sold (June 2011). Other once-super phone powers such as Blackberry and Motorola are also falling behind in the race to be the top seller of these mobile devices that are changing our lives, and the way we view mobile computing.
There was a time when the iPhone was considered a gadget; a toy manufactured by a computer company that supposedly knew nothing about the mobile phone market. Now, the iPhone is the benchmark that all other manufacturers must meet to beat. In fact, the recent release of Samsung’s Galaxy S II, which takes most of the goodness of the iPhone and goes one better, is now held up in Australian patent courts, unable to move into the market until the court case with Apple is resolved. It’s just too much like the iPhone for Apple’s liking.
So what is it that all others have to best, in order to win this arms, fingers and minds race with the now megalith Apple Corp? Here is what Motorola, Nokia, HTC, Blackberry and others have to focus on to catch up:
In the battle to be the best smartphone, design is paramount, and one can’t overlook the “looking cool” factor that Apple products ooze in general. However, for the young texting crowd (especially in Asia) Blackberry is making inroads with its colorful hang-around-your-neck Bold and Curve models. But announced this month is something even more exciting from this once business-phone giant: the BlackBerry Torch 9850/60, which, not surprisingly, looks like the iPhone but has ditched the capable BlackBerry hardware keyboard for a full-touch onscreen keyboard experience.
While most folks’ eyes glaze over when anyone starts talking smartphone specs, hardware is one aspect that sets mobile handset providers apart. Just as with all of our gadgets, the number of pixels that can be displayed or captured, the speed at which the processor processes, etc., are all factors that can influence the ultimate user experience.
But on these specs, it should be noted that Apple has always been a master at using inferior hardware specs to its advantage, i.e. taking parts that perform less on paper and having them outperform in the user’s mind. For example, there is no USB port or extra SIM slot on an iPhone, and it uses a fairly mundane processor to drive the user experience, but as sales show, this has not been a negative factor. In fact, prior to the iPhone4, Apple's handset had only one camera while the competition always included two: one on the front and one on the rear.
However, the competition is catching up hardware-wise, and taking the Samsung Galaxy S II in hand, one can really feel the difference when using its dual-core processor over the one used in the current iPhone: zippy!
Software has always been the iPhone’s strength on the battlefield. iOS 4 is probably the most fluid and effective smartphone OS to date, allows multi-tasking, and sports a Retina-class, resolution-independent display that makes watching movies, reading books, and playing 2D and 3D games just a sight to behold.
However, the new Gingerbread OS from Google (used in the HTC Sensation and others) is an up-and-coming contender, and as mentioned, is making the Galaxy S II a true iPhone rival.
Yet Apple holds the top market share in the sheer number of apps available to load on a smartphone, with over 500,000 apps available via the iTunes Store and over 15 billion downloads.
The only other combatant to even come close is Google's Android, with 250,000 applications in the online Android Market and over 4.5 billion downloads to date. Everyone else, as with Windows Phone 7, is left in the dust, or flat out dead on arrival. This is certainly the case for Nokia’s Symbian operating system, which at one time enjoyed a 78% market share (2003) but will be down to about zero in 2012, as no new phones are expected to use it.
So as we can see from above: design, hardware and software are the three weapons that each mobile phone leader has in their arsenal to win over the world of smartphone users. Today, global domination is clearly in the Apple iPhone camp, but with the rise of Android-powered phones, manufacturers like Samsung and HTC that deploy Android Gingerbread and beyond still have a place in the race.
First published in ECS Living, Issue 57, Sept - Oct 2011, p.54
But I've noticed of late that these outbursts of frenzied yelping every time some software or popular website makes a change are not isolated or rare, but are instead fast becoming the norm. So I'm wondering why...
Why did tens of thousands of Mac users flock to the Apple website to vent profusely over the release of OSX Lion, when this OS upgrade price dropped by 60% and could now be downloaded online? Why also did thousands of Final Cut Pro users do the same when an upgrade costing hundreds of USD less hit the App Store?
It seems that computer rage has taken to the internet, and is starting to infect such social media hotspots as FB, blogs, and Twitter, even to the point of getting hotheads ejected from public spaces. And not just virtual spaces, where posting in all caps and using banned language can get you blacklisted – your rage can get you in trouble in the real world as well. Take for example the case of Alison Matsu...
Miss Matsu was recently having drinks in a downtown bar in America when she tweeted "the bartender is a twerp" with the hashtag #jackoff to punctuate her feelings. A few moments later, she was asked to leave the establishment by the manager in charge.
So here we have computer rage coming full circle in a new dimension; no longer do we just pound violently on the keyboard when we lose two hours of homework to a bluescreened laptop, but we also use social media to let everyone know that we are pissed blue in the face – AND – that same social media is coming back to bite us, perhaps when our boss reads our post and decides to let us go, or our spouse realizes that without anger management classes, we are no longer fit to be with.
I find this trend a bit disturbing; how about you?
It's not news that computers, and the software that makes them tolerable, are frustrating to use, but it does seem that devices designed to make our lives easier and more enjoyable are doing anything but...
Consider Netflix's recent loss of 1 million subscribers, when a slight increase in the rental rate drove 1/10 of the company's user base away overnight. Here was rage gone viral. And when Reed Hastings, the company's CEO, tried to apologize for the change, he drove tens of thousands more customers out the virtual door.
The lesson learned here is: if you are a website owner, be careful about making any change to your online business, as users are fickle beasts and need to be treated accordingly. And if you are just the average Facebook Jane or Joe, be careful what you post there as well...
Take the recent case of Jeson Senador, now facing animal cruelty charges from the Philippine Animal Welfare Society (PAWS) for posting a pic of his puppy hanging on a clothesline after getting a bath. What was intended as a joke looks set to turn into a fine or jail time.
The power of users to incite social change as well as business change cannot be denied, as everyone from puppy abusers to power abusers is being pummeled and toppled around the globe – as seen this Arab Spring, where dictators fell like dominoes, in part due to the ire expressed on social media sites. Even the mighty Apple Corporation backed down on that Final Cut Pro upgrade decision mentioned earlier, and conceded to continue shipping the old version along with the new.
So in this light, I am recommending that there be a Gross National Happiness User poll, to find out how happy we are as computer users these days, and then perhaps a development project can be started to improve the situation as part of the UN's Millennium Development Goals.
But for those that can't wait, there are online companies that offer anger management classes with court-approved therapy (www.angermanagementonline.com). And then there is always the e-book "Stop Anger, Be Happy" by Kathy Garber, which can be ordered on Amazon.com.
The question is (in a nut-case-shell): are we becoming increasingly unhappy using computers as they become more and more critical in our daily lives, or is it our global connectedness that is making it easier to spread this unhappiness around the world? I'm not sure, so you tell me.
First published in "The Week" on Sept. 23 2011
Forty years ago (almost to the day) I used my first computer. It did not have a monitor or a keyboard that I was allowed to touch, and it was housed behind a counter in a room that I was not allowed to enter. The IBM/360 was a monster of a device, and filled a small room with toggles, blinking lights, and large pushbutton switches à la the bridge of the Starship Enterprise in a 1966 Star Trek episode.
To operate the computer, I sat at a keypunch machine (resembling a small piano) and typed in what I wanted the computer to do. This ½ day activity produced a stack of keypunch cards that I handed to someone behind the counter, who would then return to me a printout of the results the next day. In this case, it was my homework.
Computing back then, to say the least, was not much fun.
But 30 years ago, almost to this day, I got my first personal desktop, the IBM PC XT. It was large by today’s standards, and had a keyboard and monitor – albeit a phosphor green-on-black display with no resolution to speak of – and a floppy drive for disks that resembled CDs, only floppier. But it was my own, and I could do what I wanted with it, when I wanted. That was when the fun really began.
My fun however was limited to an 8MB hard drive and 128kB of memory. That’s not a typo; that’s about 1000 times less than I have sitting on my desk today. It ran PC-DOS as an operating system, and I could program in BASIC and print out the words “Hello World” on the screen. Woo-hoo! Now we were really having a ball. But I did use this computer to type up school papers and work reports in WordPerfect, and even print them out on a dot-matrix printer that sounded more like a ripsaw than anything else when printing.
Then about 20 years ago to this very day, things really started to get exciting. I was working at IBM with hypertext (the precursor to HTML) to produce online help files, and just beginning to write web pages for what was then to be the new World Wide Web. This was indeed interesting, as everything was done by hand inside a crude text editor and then rendered out to see the results. What HTML did for us was to eliminate the rendering (waiting), and we finally had instant publishing across a loose network of computers all over the world.
This rocked the socks off everyone involved, and the results of our R&D in the late 80’s can be seen on any YouTube or Facebook page today. In short, online publishing had reached the moon.
But Mars and the rest of the cosmos were to come next for both media publishing and global communication. I just think it interesting to note that it all started in backroom cubicles with folks who loved to dink with computer code and hardware bits. We would cable up monstrosities of circuit boards and then program all sorts of crazy algorithms just to make “Hello World” come alive with color, animation, video, sound, and eventually touch and vibration.
Now on the eve of retiring from all things based in chips and pixels, I have to say that the folks who brought all this to us (personal computing) had the spirit of NASA, whose immortal words from Mission Control during Apollo 13 also rang true in our IT circles: failure was not an option.
We wanted to see computers in the hands of everyone on the planet, and did not want our children to ever have to do something as painful as punching out a deck of cards and then waiting a day to get a printout of their homework. We wanted better for the next generation to come.
Considering that my wife just got a smartphone for less than $100 that can surf the web and play Angry Birds, I think we actually succeeded. What say you?
But I am not sure what that would be... more on the forgotten war in Afghanistan? Gadhafi? Syria? Nepal’s new leader?
I think I’ll go with the latter, as I was struck by what Nepali farmer Kedar Thapa said in regards to the new prime minister on nepalnews.com:
For me, it is not important whether the new Prime Minister will conclude the peace process and constitution drafting in a given time or not. I only wish the government lasts longer so that the situation is stable.
The wisdom of earthy folk always impresses me when their truisms make their way into the media. People of the land want stability, regardless of the probability that stability will ever occur. People of the “air” couldn’t care less about being grounded, as for them, chaos reigns supreme in the lofty atmospheres in which they travel. And by ungrounded people, I mean westerners and the western media.
Hurricane Irene coverage is the perfect illustration of this western desire to remain unhinged in all aspects of daily life. Excitement is stirred to fever pitch whenever a weather system moves over warmer waters and produces images from space that resemble scenes from the movie “The Day After Tomorrow,” and that also promises to provide 24x7 media coverage of rising tides and swinging traffic signals.
When I was a kid growing up in NYC we discussed the weather as it happened, and we did not have the luxury of tracking storms via NOAA printouts or tweets from FEMA. We simply looked outside the window and said to each other, “Hey look, it’s raining today.”
But today’s New Yorker will instead rush to the store based on CNN forecasts to stock up on wine and cheese and then fret over how many charged batteries they have ready for their $50 Maglites. It is all about preparing for every minor inconvenience, like wet feet or wind-blown hair.
The cost to clean up after Hurricane Irene is estimated to be $7 to $10 billion, and the storm cost over 40 people their lives. Compare this storm with one that hit closer to home: in 1970 Cyclone Bhola killed over 500,000 people in two countries, cost an estimated $185 million to clean up (about $1 billion in today’s dollars), and is credited with helping Bangladesh secede from Pakistan one year later in 1971.
Now that was one hurricane to go on and on about...
Looking at recent earthquakes in the same way also turns up some interesting numbers; take for example Washington D.C.’s recent earthquake, and compare that with Pakistan’s last major one back in 2005. In D.C., the toll is estimated at no deaths and $1 billion in damage (although it is not clear what that will be spent on outside of the 4” crack at the top of the Washington Monument), while the toll for Pakistan was $5.4 billion with 79,000 dead around the region.
Now of course I am crap at making sense of statistics such as these, but I can only conclude from the numbers that 1) when disaster strikes, it hits us harder than those on the east coast of America, and that 2) more money is spent on cleanup per person dead in America than anywhere else on the planet.
While my summary is not news-worthy to print, we all know this premise to be true: that Americans are high maintenance when compared to Asians, and that Asian living structures collapse at the drop of a hat.
However, what I do see as a valuable take-away is that Asians stand to lose the most, but seem the least concerned. A quick look at any structure going up in your neighborhood will confirm this, as will any look at GoN’s disaster preparedness plan for the next natural disaster likely to blow in or crack up from below.
We stand with farmer Kedar Thapa in this regard – wishing for a bit of stability in an ever-stormy and crumbling world.
This article first published in Myrepublica.com on 9/3/2011
Who the heck is he?
- Jiggy Gaton
- lives in Kathmandu and is an aging technologist – has been since the days of Woodstock – so in the words of Roland the Gunslinger, "he is from a world now gone by." However, Jigs is extremely up-to-date on all things tech and is also available for hire.