NYT: The Future of Computing

Here’s a nice post from the New York Times about big data, speed, and the future of computing.  It talks a little bit about the technology that makes IBM’s Watson computer so fantastic at beating Jeopardy! champions (we first wrote about this last year…), and about how the need for speed will likely change computer architectures themselves.

There will likely be groundbreaking changes in hardware and software, where computation and decision-making both become part of the same technology.  This could be where much of the analytics engineering advances come from over the next decade.  Read more from the NYT post here.

Siri and the Robot Hall of Fame

The latest edition of Newsweek magazine has a cool set of the greatest robots of all time, in honor of Apple’s introduction of Siri, the technology that lets you operate your new iPhone 4S with voice commands.

Here’s a link to Newsweek’s robot montage – but I’ve listed their picks below (which I can’t argue with!…)

1968 – HAL 9000 – The heuristically programmed algorithmic computer wreaked havoc in Stanley Kubrick’s 2001: A Space Odyssey.  One of my colleagues at work told me that he heard that HAL’s soft male (and now creepy) voice is the reason why all computer voices, from navigation systems to Siri, are female.  I can’t blame them!…

1977 – R2-D2 – The astromech droid was a treasured hero in Star Wars, rolling around with C-3PO and helping Luke Skywalker find the Force.  Star Wars is one of my favorite movies of all time, and I love the whole Star Wars universe (yes, even the prequels…)

1984 – The Terminator – Humanity survived the assault of Skynet’s rampaging artificial-intelligence machine. The state of California was not so lucky.   I had a recent conversation about the iPhone’s new Siri application (about how Siri could still command the iPhone even if the phone was locked), and we eventually got around to connecting this back to Skynet!…

1997 – Deep Blue – IBM’s chess-playing computer beat world champion Garry Kasparov, who accused it of cheating and demanded a rematch.  Tough day for humans who want to win at chess…

2000 – ASIMO – Honda’s 120-pound humanoid robot can recognize faces, follow the movement of objects in its field of vision, and run at 3.7 mph.  My daughter totally digs ASIMO – I’ve seen it three or four times at Disneyland.  There’s a kind of sad video of ASIMO falling down while trying to climb stairs – it makes you really feel for the guy!…

2008 – WALL-E – Pixar’s beloved Hello, Dolly!–watching garbage collector has raked in more than $535 million in worldwide grosses.  Amazing that Steve Jobs had a role in this great robot as well, being the man who helped make Pixar the company that it is today.

2011 – Watson – The IBM supercomputer trounced two Jeopardy! champions this year, and can access 200 million pages of content in mere seconds.  This could be foundational technology for how computers solve challenging problems – pretty amazing stuff!…

2011 – Siri – Apple sold more than 4 million iPhone 4S units in three days on the strength of this voice-activated personal-assistant bot.  Siri could change the way we watch TV as well.

Apple Rumored to Take Over TV

GigaOM and the New York Times have reports on Apple’s plans to reinvent the way we watch TV. 

Two days ago, GigaOM reported on the iTunes boss at Apple moving over to head up Apple’s television plans.  Today, they rebroadcast an article by NYT’s Nick Bilton that Apple is planning to launch a Siri-enabled television in 2013.

For those of you who have been vacationing in seclusion over the past month, Siri is part of Apple’s newest iPhone offering, allowing you to operate your iPhone with voice commands.  According to these reports, you’ll soon be able to just talk to your TV set, and it will respond to your commands. 

(Hopefully it’ll be easier than programming VCRs…)

Computerworld on Really Big Data

Computerworld recently posted an article describing the challenges of “big data”, or what they see as “really big data”.  They’ve posted on these challenges before, and it’s interesting to see where they think the big data challenges lie:

“We’ve all heard the predictions: By 2020, the quantity of electronically stored data will reach 35 trillion gigabytes, a forty-four-fold increase from 2009. We had already reached 1.2 million petabytes, or 1.2 zettabytes, by the end of 2010, according to IDC. That’s enough data to fill a stack of DVDs reaching from the Earth to the moon and back — about 240,000 miles each way.

For alarmists, this is an ominous data storage doomsday forecast. For opportunists, it’s an information gold mine whose riches will be increasingly easy to excavate as technology advances.”
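Out of curiosity, the DVD-stack claim is easy to sanity-check with a few lines of back-of-the-envelope Python.  This is just a rough sketch under my own assumptions (single-layer 4.7 GB DVDs, each about 1.2 mm thick – neither figure is from the Computerworld piece):

```python
# Rough sanity check of the "stack of DVDs to the moon" claim.
# Assumptions (mine, not Computerworld's): single-layer 4.7 GB DVDs,
# each roughly 1.2 mm thick.

total_bytes = 1.2e21          # 1.2 zettabytes (IDC's end-of-2010 figure)
dvd_bytes = 4.7e9             # capacity of a single-layer DVD
dvd_thickness_m = 1.2e-3      # thickness of one disc, in meters

num_dvds = total_bytes / dvd_bytes
stack_miles = num_dvds * dvd_thickness_m / 1609.34

print(f"{num_dvds:.2e} DVDs, stack about {stack_miles:,.0f} miles tall")
```

With these assumptions the stack comes out at roughly 190,000 miles – the same order of magnitude as the quoted Earth-to-moon distance, so the comparison is at least in the right ballpark.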

Right now, most companies in the “big data” space seem to be focusing on two things: (1) handling, storing, and moving data, and (2) building tools to visualize and analyze data.  With the explosion of data being generated, the need for technology to store it and move it around is obvious.  And since there is so much data, tools can be readily developed to grab the low-hanging fruit that “big data” can provide.

These technology pushes are absolutely important, and necessary even to get at the easy problems that can be solved with mountains of data.  However, as the problems get harder, there will come a need for engineering the right analytics.  Data science and a rigorous engineering discipline can really push the envelope on what can be done in this “big data” space…

More on Computerworld’s take on the data mountain challenge can be found here.

Top Twelve Lessons from Steve Jobs

Steve Jobs and Guy Kawasaki

Guy Kawasaki has written a post on the top lessons he learned from Apple’s co-founder Steve Jobs, who passed away last week.

Why should we pay attention to Guy’s lessons?  Well, Guy was the chief evangelist for Apple, working directly with Jobs in the early days of the PC, and was critical in cultivating the market for their first Macintosh computer in the 1980s.  He knew firsthand about the genius (and perfectionism) of Steve Jobs, and has been witness to Apple’s successes and failures since.

He went on to become co-founder of Alltop.com, an “online magazine rack” of popular topics on the web, and a founding partner at Garage Technology Ventures.  Most recently, he’s released his book Enchantment:  The Art of Changing Hearts, Minds, and Actions. 

I think Guy is right on for these Jobs lessons – here’s one that I thought was particularly good:

6)  You can’t go wrong with big graphics and big fonts.

“Take a look at Steve’s slides. The font is sixty points. There’s usually one big screenshot or graphic. Look at other tech speakers’ slides—even the ones who have seen Steve in action. The font is eight points, and there are no graphics. So many people say that Steve was the world’s greatest product introduction guy… don’t you wonder why more people don’t copy his style?”

I’ve seen so many presentations that cram too many words on a slide, where it’s clear the presenter thinks the information alone is what matters – cram more on, and the slide must be better.  Jobs was a master at focusing people on his message – leading to incredible success for his companies and his innovations.

Take a read of Guy’s post here – it’s worth learning some of these lessons…

Honoring Steve Jobs

Steve Jobs 1955-2011

I have been fascinated with the story of Apple ever since I first used an Apple II in school.  The first computer I ever purchased on my own was a Macintosh IIsi on which I wrote my thesis.  Our first personal computer at home was a Macintosh.

My wife and I each have iPhones.  We bought an iPad.  We just bought my 11-year-old daughter her first mobile phone – an iPhone 4 (she has a better phone than we do!…).  We have purchased three iPods and have downloaded many apps, songs, and movies through iTunes.

Our lives are better through the innovation of Apple, and today is a day where it’s worth reflecting on who created that impact.

Steve Jobs passed away today – one of the great American entrepreneurs.  I can’t honor him any better than others already have.  Here are a few posts about Jobs, his life, and his impact.

A tip of the cap and a huge thank you for everything you’ve brought into the world!

Data Science Defined

So, what is data science?  There is an increasing buzz about it lately – A Fortune article recently dubbed the “data scientist” as “The Hot New Gig in Tech”.

This report is a good primer for the new field of data science.  In some ways (as with many fields), it’s not new at all.  In fact, I think my company has been involved with raising the bar of data science within the defense and intelligence worlds for over 35 years – I’ve personally been involved for nearly 20 of those.  However, with the volumes of data being generated, and the ease of being able to process this data, there is now a recognition of the value of data science.

Another recent post by DJ Patil, chief scientist at LinkedIn, discusses how he built the data science team there, and he has a lengthy discussion on what the roles of data scientists are and what to look for in building your teams. 

In fact, in the O’Reilly data science report, Patil was referenced as believing that the best data scientists:

“tend to be “hard scientists”, particularly physicists, rather than computer science majors.  Physicists have a strong mathematical background, computing skills, and come from a discipline in which survival depends on getting the most from the data.  They have to think about the big picture, the big problem.”

That has been my experience as well – the best data scientists we find are the ones with a strong math background, whether they majored in physics or in computer science.  Strict coders find data science a bit challenging if they don’t come armed with the strong math skills needed to pull information from the data…

Well, here’s my definition.  “Data science” is the comprehensive analysis of data and how it is created.  This means understanding where data comes from, what data represents, and how to turn data into actionable information (something upon which we can base decisions).  This encompasses statistics, hypothesis testing, predictive modeling, and understanding the effects of performing computations on data, among other things.  Science in general has long been armed with many of these tools, but data science pools the necessary tools together to provide a scientific discipline for the analysis and productizing of data.
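As a tiny illustration of the hypothesis-testing piece, here’s a minimal sketch in Python (with made-up numbers) of a permutation test – one of the basic tools for asking whether a difference between two samples is real or just noise:

```python
import random

# Hypothetical data: daily click-through counts for two versions of a page.
a = [12, 15, 14, 10, 13, 17, 16, 11]
b = [18, 21, 17, 22, 19, 16, 20, 23]

def mean(xs):
    return sum(xs) / len(xs)

observed = mean(b) - mean(a)

# Permutation test: if the group labels don't matter, shuffling them
# should produce differences as large as the observed one fairly often.
random.seed(42)
pooled = a + b
count = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = mean(pooled[len(a):]) - mean(pooled[:len(a)])
    if diff >= observed:
        count += 1

p_value = count / trials
print(f"observed difference: {observed:.2f}, p-value: {p_value:.4f}")
```

A small p-value suggests the difference is unlikely to be chance – exactly the kind of actionable information a decision can then be based on.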

A complementary term in current use is “analytics”, and I found an interesting and very appropriate definition on Wikipedia: “the process of obtaining an optimal or realistic decision based on existing data.”  Analytics, then, are useful metrics derived from data upon which decisions are made.

Combining data science and analytics together gives a foundation upon which new computing paradigms and data products can be generated.  Cool stuff!…

NKS Now Available on the iPad

I really like Stephen Wolfram’s book A New Kind of Science (or NKS for short) – on how simple computational programs can create amazingly complex things.  I’m also really enamored with the iPad (I’ve written about it before a number of times…).

Now, two of my favorite things are coming together – Wolfram’s NKS is now available on the iPad.  It’s available at Apple’s iTunes store for $9.99 – far less than the hardback version and certainly much lighter!…  (Hmmm…  Two things coming together – kind of like chocolate and peanut butter in a Reese’s Peanut Butter Cup…)

I haven’t purchased an iPad yet, but now I’m really excited to think about possibly maybe starting to look into getting one… (or at least window shopping for one…)
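If you’re curious what “simple programs creating complex things” looks like, Rule 30 – one of the elementary cellular automata at the heart of NKS – can be reproduced in a few lines of Python.  A quick sketch (my own, not code from the book):

```python
# Elementary cellular automaton, Rule 30: each cell's next state is
# determined by its current state and its two neighbors.
RULE = 30

def step(cells):
    n = len(cells)
    out = []
    for i in range(n):
        # Read the 3-cell neighborhood (with wraparound) as a number 0-7,
        # then look up that bit of the rule number.
        pattern = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((RULE >> pattern) & 1)
    return out

# Start from a single black cell and watch complexity emerge.
width, rows = 31, 15
cells = [0] * width
cells[width // 2] = 1
for _ in range(rows):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Even from that one black cell, the pattern quickly turns irregular – which is exactly the point of the book.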

Read more about A New Kind of Science for the iPad on Stephen Wolfram’s blog here.

Supercomputer Wins At Jeopardy! – IBM’s Watson

Here is an incredibly in-depth article from the New York Times about IBM’s latest artificial intelligence invention – Watson.

IBM gained fame in this supercomputer arena with their Deep Blue computer, beating world chess champion Garry Kasparov two games to one (with three draws) in a six-game match.  This was the first time a computer had ever beaten a human chess champion.  Kasparov apparently was so upset that he demanded a rematch, but IBM ended up dismantling the Deep Blue computer.

Rather than focusing on known chess moves, Watson can understand natural language, and then play competitive games of Jeopardy! by providing the correct answers (in the form of a question, mind you).  You can actually play against Watson through the IBM website (I got my butt stomped, by the way…).

Take a read of the New York Times article here.  Facebook readers can watch a video on IBM’s Watson here.

What’s In A Name? – Let Wolfram|Alpha Tell You

A recent article in the Vancouver Sun shows some interesting things that you can do with the new computational knowledge engine Wolfram|Alpha.

Author Chad Skelton decided to check out the statistics of his first name, “Chad”.  By entering “chad” into the input line, Wolfram|Alpha gives you a wealth of information.  Rather than returning a series of links to webpages that may or may not be relevant to what you’re looking for, it gives you relevant information (that Wolfram|Alpha actually computes itself) in response to your request.

Now, since “Chad” is an African country, you can end up getting population, geographic, and other demographic info on the country.  But Wolfram|Alpha also understands that “chad” is a name, a lake, a protein, and also a word (for all you Florida election fans out there…), so you can click on any of these to get that information.  Skelton found out, for instance, that the use of “Chad” as a first name peaked around 1973.

Unfortunately, when I put in my name (“Mic”), I find that I’m either a unit of measurement (a microgram) or a constellation (Microscopium).  “Michael” is a bit easier to identify as a name, and still seems to be pretty popular, peaking between 1950 and 1980.

You can read Skelton’s original article here, or play around yourself at Wolfram|Alpha.