Losing our Sacred Data - how to make sense of Capital One, Equifax, and Facebook; plus 13 short takes on top developments
Hi Fintech futurists --
In the long take this week, I explore Capital One's massive data breach, and the penalties they are likely to face. We can compare the potential outcome to those faced by Equifax ($600+ million) and Facebook ($5 billion). A compelling framework emerges out of this analytical journey about which data we hold sacred, and how our behavioral biases may betray us.
The latest short takes on the Fintech bundles, Crypto and Blockchain, Artificial Intelligence, and Augmented and Virtual Reality are below. Thanks for reading and let me know your thoughts by email or in the comments!
Here is our factbase for this week. Capital One recently suffered a data breach resulting from poor security practices that exposed 100 million credit card applications and accounts. They expect the breach to cost the company $150 million. Two years back, Equifax lost 140 million identities, again from poor security practices. At the time, I said that according to GDPR this should cost them $150 million. They have since settled for about $600 million -- though some of that seems to be in-kind services coverage like free credit monitoring (lol!). Separately, Facebook has settled for a $5 billion fine associated with the Cambridge Analytica privacy "breach".
As a percentage of revenue, $5 billion out of $60 billion (~10% for Facebook) or $600 million out of $3.5 billion (~20% for Equifax) seems to be of a similar magnitude. Capital One's estimate of $150 million on $28 billion seems off, to say the least. But let's get some macro data out there before thinking more deeply about the issue. Identity and data, and in particular financial identity and data, are valuable. On average, a stolen digital human fetches about $200 on the black market, and the per-capita cost of a data breach to the company is roughly the same. Cyber insurance, which in the aggregate is supposed to counteract these damages for companies, amounts to at least a couple of billion in annual premia -- and probably a few dozen billion in coverage.
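Those back-of-the-envelope ratios are easy to check. A quick sketch in Python, using the approximate fines and annual revenues cited above:

```python
# Rough fine-to-revenue ratios for the three cases discussed.
# Figures are the approximate ones cited in this piece.
cases = {
    "Facebook":    {"fine": 5_000_000_000, "revenue": 60_000_000_000},
    "Equifax":     {"fine": 600_000_000,   "revenue": 3_500_000_000},
    "Capital One": {"fine": 150_000_000,   "revenue": 28_000_000_000},
}

for name, c in cases.items():
    ratio = c["fine"] / c["revenue"]
    print(f"{name}: {ratio:.1%} of revenue")
    # Facebook ~8%, Equifax ~17%, Capital One ~0.5%
```

The Facebook and Equifax penalties land within the same order of magnitude as a share of revenue; Capital One's $150 million estimate is an order of magnitude smaller.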
So here's the issue I have. There is a lazy thing to say, and I said it in 2017 about Equifax. It goes like this. Look at all those hypocrites in the large financial companies! They point to Fintechs and Crypto, the innovative parts of our economy, and accuse them of poor practices. They insist on inequitable, overly heavy-handed regulation and security expectations that stifle young companies. And yet, only 2% of all Bitcoin transactions have anything to do with illicit activity -- no different than the traditional economy, which sees 2-5% of GDP pass through money laundering. And yet, they keep losing our most important data by the millions, never facing repercussions for their sins.
That's a fun, accurate, finger-wagging argument to make. But it doesn't do any work. It is useless. Instead, let's take a more systemic approach. We can acknowledge that crime, theft, and mutual destruction are human attributes, not externalities of a technology. Yes, we would like to minimize crime. But it is endemic to all human systems; it is a part of us. Therefore, we have to accept that some percentage of our data, money, privacy, and other valuables will be stolen, misappropriated, or destroyed. We will fight that -- but some amount, let's say 2%, will slip through. The issue is about the actors in the system itself, and today the problem is merely becoming more transparent.
The second step is to think about our rationality versus our feelings (if you want to read 1,000 pages on this topic as prep, I recommend Yudkowsky). From an economic perspective, the following two scenarios are identical in expectation. In scenario A, you lose 2% of your data with 100% certainty. Imagine this as losing a non-core credit card once per year, and then having to cancel it with the bank. Inconvenient, but nothing to worry about. In scenario B, you lose 100% of your data with 2% certainty. The expected loss is exactly the same, but I would guess that most of us would pay far more to avoid scenario B, because we are risk-averse animals. Any chance that you will lose everything you have is terrifying -- and much harder to remedy.
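The asymmetry between the two scenarios can be made concrete. A minimal sketch, assuming an arbitrary "data wealth" of 100 units and a log utility function -- both illustrative choices, not anything from the text:

```python
import math

wealth = 100.0  # illustrative "data wealth" in arbitrary units

# Scenario A: lose 2% of the data with certainty.
# Scenario B: lose 100% of the data with 2% probability.
expected_loss_a = 1.00 * (0.02 * wealth)
expected_loss_b = 0.02 * (1.00 * wealth)
assert expected_loss_a == expected_loss_b  # identical in expectation

# A risk-averse (concave) utility function tells a different story.
def utility(w):
    return math.log(w + 1)  # +1 keeps utility finite at total loss

eu_a = utility(wealth - 0.02 * wealth)             # certain small loss
eu_b = 0.98 * utility(wealth) + 0.02 * utility(0)  # small chance of losing everything

# For any concave utility, eu_a > eu_b: the risk-averse agent
# pays a premium to avoid the total-loss scenario.
```

The gap between the two expected utilities is exactly the risk premium the text describes: what we would pay to avoid the chance of losing everything.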
Another dimension I want you to think about is "sacredness". Something is sacred, in the sense I am using the word, when the cultural significance attached to it precludes an economic discussion. For example, human lives are sacred. No amount of insurance will make up for an outcome where a person is killed! And yet, governments make these calculations all the time when evaluating policies on topics like speeding, smoking, and water safety. Further, some things are sacred to some people, but not to others. What is a political cartoon to one person, is a declaration of religious war to another. To bring us back down to Fintech and cyber security, my main point is that *privacy* and *personal data* could be sacred in one context (e.g., an American high-income person who studied constitutional law at Yale), and not as sacred in another (e.g., a farmer in China who gets loans from the government).
Sacredness is a multiplier on how important something is to the person within their context. For many of us, we are fine losing social media photos, Twitter puns, or even our passwords. But financial information can be much more personal and embarrassing -- take for example the fact that we still do not have Donald Trump's tax returns. I would bet that he finds those to be a sacred screed. Similarly, Google has a lot of sacred data. Imagine exposing to the world all of your search history, or having that search history be the basis for eligibility to get a bank account. Ok. So with these tools, let's put together a framework.
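One way to make the framework concrete is as a toy expected-harm calculation: probability of breach, times share of data lost, times its black-market value, scaled by a sacredness multiplier. The multiplier values below are illustrative assumptions; only the $200-per-identity figure comes from earlier in the piece:

```python
# Toy formalization of the framework; multiplier values are
# illustrative assumptions, not measurements.
def expected_harm(p_loss, fraction_lost, value, sacredness=1.0):
    """Chance of breach x share of data lost x black-market value,
    scaled by how sacred that data is to the person in context."""
    return p_loss * fraction_lost * value * sacredness

# Breach-style case: small chance, total loss, sacred financial data.
breach = expected_harm(p_loss=0.02, fraction_lost=1.0, value=200, sacredness=5.0)

# Routine case: certain small loss of non-sacred data (a cancelled card).
routine = expected_harm(p_loss=1.0, fraction_lost=0.02, value=200, sacredness=1.0)

# Same raw expected loss ($4 of data value), but the sacredness
# multiplier makes the breach scenario several times costlier.
```

The point of the sketch is only that two events with identical dollar expectations diverge once sacredness enters the equation.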
What does this tell us? First, the Capital One and Equifax bits are negatively surprising, but in the way that losing a gambling bet is negatively surprising. We have always known that there is some low chance of loss, and we have known that the data at stake is our financial data. We took the gamble of a 2% loss on a 100% cost, and when that loss actualized, we felt bad. The outrage we see today is a response to experiencing the cost. Perhaps we thought the chance of loss was lower, or we are appalled at the technical incompetence of the humans involved in those cases. But there's nothing deeper there, in my view.
The correct outcome is to improve the quality control of the system. Perhaps this can be done by forcing cloud providers like Amazon to ship more safety limitations out of the box, or by moving more of our information onto blockchain-based systems where individuals control their own data. At least in that case, the losses will be internalized to each individual at the time of their personal failure (lost my keys!), rather than correlated and externalized to the entire group whose data a centralized party (e.g., Capital One) is managing. But we cannot fix human society structurally just by asking people to download wallets. We cannot change our lazy, careless nature.
The second thing the framework tells us is about the scenario of 100% loss with 2% chance. We used to believe that Facebook and Google had our information, but that it wasn't particularly valuable, personal, or sacred. This is of course entirely wrong. We have learned the hard way that the Tech giants have everything; and that the more sacred it is, the more they want it. We also used to believe that what they have is relatively secure and inaccessible to others. This too is incorrect. By opening up the honeypot to Cambridge Analytica, Facebook made it a core business practice to bleed out what we want to protect.
I would say what we have lost is the right and the ability to think our own thoughts. To make up our own opinion, crushed as we are in the maw of algorithmic advertising and propaganda.
This second thing is far worse than a hack, and should be punished far more severely. Systemic design that takes the probability of loss and turns it into a business model is a flawed system -- and one we should abhor deeply. I don't have to persuade people to be outraged at Facebook; they already are, for far less clearly articulated reasons. But this thought process has helped me identify why invisible microthefts are a problem, and how to fix them. We see Facebook addressing the issue by both (1) lowering the chance of loss, by saying that the open developer program that powered Cambridge Analytica is now closed (or better monitored), and (2) lowering the value of the loss, by re-focusing on privacy and submitting itself to increasing regulation. And yet here they are, trying to start a new global currency!
The good news is that people are finally waking up to the fact that they have made a bad bargain. We recognize that the faces of our children are used to power machine vision artificial intelligence algorithms, that our location and shopping data can be used to discriminate access to financial services products, and that our searches and conversations are neither private nor fully protected. With this recognition comes a sense of cost: how much are we willing to give up, now that we see that things are not free? Listen, all technology and human processes are fallible, and so we should not aim for perfection. We should aim at the intersection of marginal cost and marginal benefit around security, privacy, liberty, and convenience. We should assume the risk and sail into the Great Beyond.
Check out my interview on the CIOs & Bowties podcast. Here are Part 1 and Part 2. We talk about wealth management, artificial intelligence, YouTube influencers, Asian Fintech, and all sorts of good stuff.
Singapore-based wealthtech startup Bambu raises USD 10 million in Series B round. Digital wealth platforms have sprouted all over the world, both because there is a need, and because the infrastructure and regulation around which they are built differ across markets. Franklin Templeton is the driving investor here. Similarly, StashAway raises USD 12 million in Series B round led by Eight Roads Ventures. We hear good things about this company (despite its name similarity to other roboadvisor Stash), and it is backed by Fidelity International.
Northern Trust Asset Management Acquires Emotomy Platform. I don't see a transaction comp, but it is important to note when a large traditional firm acquires a digitization strategy. It's no FutureAdvisor/BlackRock, but what is?
Revolut's adding stock trading to its app to get millennials investing. Are you getting Revolut and Robinhood mixed up? You should be -- the first one is a banking app powering FX conversion, including cryptocurrency trading, and has just introduced free stock trading. The second one is a free stock trading app with a crypto investing feature, and a failed attempt at introducing a bank card. Both have raised too much money.
Walmart Wants to Patent a Stablecoin That Looks a Lot Like Facebook Libra. I generally applaud that organizations are waking up to the concept of integrating digital financial instruments into their business model, especially when it empowers clients. What boggles my mind is that people are still trying to patent ideas stemming from open source software. What's wrong with you?
Failure to [Coin]Launch – Caution for crypto-asset consultants, advisers and service providers. Hey, if you are an "advisor" to an ICO (or STO) and are marketing unregistered securities without a license, maybe it makes sense that the regulator would be upset. Also, FCA provides clarity on current cryptoassets regulation. Some interesting classification changes, and overall positive.
Japanese Sony Corp invests heavily in cryptocurrency platform Bitwala. One of the most fun reports I did at Autonomous looked at the history of Payments across the globe. Four types of actors are key to how the industry looks regionally: (1) governments, (2) financial companies, (3) digital hardware & telecoms, (4) software and social networks. In most regions, Telecoms and hardware providers are behind software providers. But crypto is another pass at getting this right -- thus Sony investing in Germany's crypto bank.
Machine Learning for More Equitable Credit Powers Airfox -- ZestFinance Partnership. Credit underwriting by artificial intelligence for "underserved populations" in Brazil, targeting 50 million shut out of traditional banking. Lots of interesting expansion happening in South America.
Zen and the Art of Artificial Intelligence. Great podcast with CEO of Responsive AI, a wealth management next-best-actions platform, and Craig Iskowitz of Wealth Management Today. Why is AI stupid? Because people are framing the problems it is solving incorrectly.
Leveraging blockchain to make machine learning models more accessible. Late to this news, but it is still worth calling out. To hear Microsoft launch an initiative called Decentralized & Collaborative AI on Blockchain does make my internal Elon Musk alarm bells go off (kill it with fire!). That said, it is clear that high quality automated machine judgment, and the data which powers it, will be accessible in blockchain-based software in a few years.
Are your surveys being ignored? Engage customers with voice! Hey look -- it's a machine voice robo caller.
Amazon in-car delivery expands to Honda and Acura vehicles. Sort of interesting. The idea is that Amazon could deliver packages to the trunk of your car, if your vehicle is connected to a particular software package that enables a smart car -- and the Amazon delivery driver has app access. How does this impact commerce? Will your car tip the delivery person depending on how nicely they slam the door?
Microsoft Discontinuing Major OS Updates for HoloLens 1. There's a new HoloLens coming, and it looks far more like the office of the future, with a wider field of view. Magic Leap is starting to apply some pressure in the space.
Zynga Debuts Pagani Hypercar in Augmented Reality via CSR 2 Mobile Game. What if you were making a big luxury product announcement, and your first channel for getting the news out there was an AR video game? How a user would preview or buy this product is worth considering.