I recently had the pleasure of attending a discussion between Steve Blank and Gary Shapiro on startups & innovation. While there wasn’t a huge amount of new insight for a San Francisco audience already steeped in lean principles, I was glad to finally pair their voices and mannerisms with all the words they’ve written.
While discussing how industries beyond Silicon Valley have sought repeatable innovation through bureaucratizing teams and processes, Shapiro shared a brief anecdote about meeting with US military CIOs. He was amazed by their sophisticated capabilities in acquiring, interpreting, and acting on raw data. In particular, he was impressed by how vehicle movement patterns across entire cities were tracked and analyzed to predict future behavior - an incredible feat only recently made feasible by surveillance drones, some capable of tracking
“every moving object across an area of 15 square miles” and storing “one million terabytes of video per day, 5,000 hours of HD footage, while broadcasting live streaming footage” ➠.
Shapiro’s enthusiasm for the technology independent of its use particularly struck me because just a month ago I saw Mark Mazzetti, Pulitzer Prize-winning National Security Reporter for the New York Times, sit in the same chair and describe how vehicle movement patterns are used to select signature drone strike targets in Pakistan. While the exact qualifications required for signature strikes are not public, they are attacks against vehicles, individuals, and locations which
“bear the characteristics of Qaeda or Taliban” ➠ - in short, strike authorization is determined primarily through pattern recognition.
Yet “the CIA could not confirm the identity of about a quarter of the people killed” ➠, and “the strikes have killed an estimated total of 2,600 to 4,700 people” ➠. Scroll through this table of documented drone strikes and note how often the number killed is listed as “Unknown,” and how many targeted organizations are designated “Unclear” instead of “Al-Qaeda.” The degree of uncertainty and collateral damage is unsettlingly high, considering these attacks kill people.
I don’t mean for this to be a comment on the morality or efficacy of drone strikes. Rather, it’s an observation on how disconnected data-based decision making can be from its very real human impact. It was a bit startling to see an enthusiastic Silicon Valley endorsement of a capability that failed to address how data-heavy decision processes fundamentally alter our awareness of, and compassion for, the people affected. This shouldn’t be a binary choice, but it’s often embraced as one.
To dial it back a bit, inadequacies in even the most extensive customer and user behavior databases have sent companies back towards more humane decision making - for example, Facebook’s adoption and advocacy of “Data Informed, Not Data Driven.” For those of you interested in the broader subject of technology trust & security, I’ve been greatly enjoying Bruce Schneier’s blog.