Wednesday, June 11, 2014

The things we don't intend to share

Well, it's been quite a while since I have felt inspired to write; however, my local NPR station got me thinking once again.

I've heard NPR call them 'driveway moments': you're listening to a story and find yourself no longer driving but parked in your driveway with the engine running, still listening because the story is so good.

The story began with a reference to Edward Snowden, which immediately perked up my ears, but it was just an intro line.  I had heard the teasers for this story for a while and was interested to hear the results.  Very interesting, to say the least.  The story is about what can be learned from commercial software monitoring your technology and the data it sends and receives.  Now, this is distinctly different from what the three-letter agencies (TLAs) have at their disposal, so keeping that in mind is important when considering the revelations of this story.

As of this writing, the NPR web site shows two parts to the story, but I believe from my listening to the radio that there may be more coming.

Now, I know this subject matter very well; however, it still occasionally reminds me of how much I ignore for simple convenience, like most other people do.  This, in and of itself, is an interesting aspect of the story that really isn't called out: how much of our privacy we give up as a matter of convenience.  This is not a characteristic of European privacy law, but that is an article for another day.  At any rate, the reporter commented on his surprise during the initial setup of the experiment, when confirming that everything was working led to some back-and-forth questions about whether or not he was actively using his cell phone.  He was not, but the monitoring team was seeing substantial traffic from his phone to the internet while it sat 'quietly' on his desk in lock mode.
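To make that concrete, here is a minimal sketch of the kind of tally a monitoring team might run over captured traffic. The device names, hostnames, and packet records are all hypothetical, invented for illustration; the story does not describe NPR's actual tooling.

```python
from collections import Counter

# Hypothetical packet log: (source_device, destination_host) pairs that a
# monitoring team might capture while the phone sits idle and locked.
idle_capture = [
    ("steves-iphone", "api.weather.example.com"),
    ("steves-iphone", "push.example.com"),
    ("steves-iphone", "analytics.adnetwork.example.com"),
    ("steves-iphone", "analytics.adnetwork.example.com"),
    ("laptop", "mail.example.com"),
]

def traffic_by_device(capture):
    """Count outbound connections per device -- no user interaction required."""
    return dict(Counter(src for src, _dst in capture))

print(traffic_by_device(idle_capture))
# The 'quiet' locked phone still dominates the capture.
```

The point is simply that the tally is nonzero for a phone nobody is touching: background apps and services phone home on their own schedule.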

One of the other things this story highlights well is the 'side-channel way' that adversaries can get your data.  Rarely does a compromise result from a direct frontal assault.  The story mentions that every system has old programs, and it can be those programs that leak data.  Add a few more bits and pieces of micro-facts from other programs and you have a significant piece of data, a coherent piece of information, or even a story about you.  Two illustrative points from the story make this clear.  The first is the way that 'Steve's iPhone' can lead from just any Steve in the country or world to a specific Steve at NPR in Menlo Park.  A simple Google search after that and you have your specific Steve Henn.  The other illustrative point is what I call the famous-last-words compromise: 'well, I didn't expect them to do that!'  This was illustrated by the 'adversaries' in this instance using several other side channels of information: older programs that are less likely to be considered for patching, or are assumed stable and not in need of patching, yet are leaking data and are not secure.
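The narrowing-down step above can be sketched as a set intersection. Each side channel alone yields a harmless-looking candidate list; intersecting them nearly pins down one person. The candidate names (other than Steve Henn himself, who is in the story) and the channels attributed to each set are assumptions for illustration only.

```python
# Hypothetical micro-facts, each leaked by a separate, individually
# harmless channel. No single set identifies the owner on its own.
steves_by_device_name = {"Steve Henn", "Steve Jones", "Steve Lee"}       # 'Steve's iPhone' broadcast
people_on_npr_network = {"Steve Henn", "Laura Sydell", "Aarti Shahani"}  # unencrypted app traffic
people_in_menlo_park = {"Steve Henn", "Steve Lee", "Laura Sydell"}       # location-tagged requests

def correlate(*fact_sets):
    """Intersect the candidate sets produced by each side channel."""
    return set.intersection(*map(set, fact_sets))

print(correlate(steves_by_device_name, people_on_npr_network, people_in_menlo_park))
# -> {'Steve Henn'}
```

Each leak looks trivial in isolation; the intersection is the compromise.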

So what is the takeaway?  Well, certainly we need some monumental legislative change in this country for privacy.  We need to change our legal privacy expectations from 'opt-out' to 'opt-in.'  This means that any provider of hardware or software would only be allowed access to your personal information if you specifically grant them access (opt-in).  This is the basic model that European privacy law follows.  By contrast, our legislative structure allows organizations to collect our data with minimal notification in legalese, rather than plain and easy-to-understand language, UNLESS we specifically say they cannot (opt-out).  That simple fact alone is the foundation of much of what allows adversaries to perpetrate their craft against our software.
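The difference between the two models comes down to what silence means. A minimal sketch, with a hypothetical `ConsentPolicy` type invented here to make the defaults explicit:

```python
from dataclasses import dataclass

@dataclass
class ConsentPolicy:
    """Hypothetical consent model contrasting the two legal defaults."""
    opt_in_required: bool     # European-style: collection off until granted
    user_granted: bool = False
    user_objected: bool = False

    def may_collect(self) -> bool:
        if self.opt_in_required:
            return self.user_granted        # silence means NO consent
        return not self.user_objected       # silence means consent

us_default = ConsentPolicy(opt_in_required=False)
eu_default = ConsentPolicy(opt_in_required=True)
print(us_default.may_collect())  # True: they may collect until you object
print(eu_default.may_collect())  # False: they may not collect until you grant it
```

Under opt-out, doing nothing hands over your data; under opt-in, doing nothing protects it.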

The second takeaway is a bit more ambitious and far-reaching, and could have serious economic impacts.  Strap yourself into your chair for this one....  Software and hardware manufacturers need to be held criminally and fiscally responsible for security flaws.  I'll let that thought sit with you for a moment, because it is a big one.

I'll wait.  Go ahead and ponder it.

Yes, that does mean...

Yeah, and that too...

But it also means that security would be mandated by design and become an integral part of the economic product-design decisions that every maker of software and hardware goes through at some level.  When security becomes a required feature of our computing landscape, tantamount to holding car makers responsible for the proper functioning of required safety features like seat belts and air bags, you will see some effective security happen.

Until then...we will continue to hear 'well, I didn't expect them to do THAT!?!?'