Observing the public conversation around FB, and the private ones happening among techies and ex-FBers, I think the mutual misunderstanding is worse than when I set out two years (and 500 pages) ago to (in a small way) bridge that gulf.

We're basically fucked.

The tech world has gotten so huge, self-reinforcing, and insulated from reality that its people can no longer even vaguely see themselves (and their actions) as others do. They just live on a different planet than most people.
Conversely, the average tech consumer doesn't understand the technology that has slowly taken over their lives, and their designated emissaries to figure it out--politicians, pundits, regulators, journalists--understand it barely better than they do, and have their own agendas.
To get beyond generalities for a moment, here's what I think is likely the core problem.

Techies take weird, improbable visions, and make them realities: some BS pitch deck to a VC, mixed with money and people, really does turn into some novel thing.
Most people work inside a legacy industry that's evolved that way over time (usually for good reasons), and they think about the future via some analogy with their present (which is a function of a long-ago past). The disruption that tech will introduce is often hard to grasp.
We have techies who are technically skilled and motivated, but who (and I'd be the first to admit it) often have narrow educations that don't let them see the bigger picture. And we have everyone else, living in that world, who don't understand its technological implications. That's our mess.
Let's take a concrete example of how this shakes out. Not sure why I'm doing this, as I'm done with screaming into the FB media tornado, but I'll just go ahead:
Disinformation is not a solvable problem. It is here to stay, forever. Every election in the foreseeable future will feature massive amounts of user-generated disinformation. The only hope is to culturally adapt, as we did to other weird aspects of social media.
Why do I say that?

Facebook & Co. can take on the most egregious examples of disinformation, or efforts undertaken by identifiable state actors (maybe), but they will never be able to shut it down entirely.
Assuming some semblance of free speech, ubiquitous online identity, some amount of engagement-optimized distribution (even if crude and self-selected, like on WhatsApp), and global reach, we will always have disinformation, full stop.
No techie I've spoken to--and I'm talking about people who've spent years inside FB or TWTR--thinks it's solvable at scale, and anyone who says so is blowing smoke up your ass.

Why do I feel confident in this assertion (that I'm sure will get trolled)?
Remember privacy? Remember how that was the biggest angle on the FB story, and how many rivers of electrons were spilled in talking about it?

Where'd that end up? Nowhere. We got GDPR, which is pointless, and if anything solidified FB/GOOG's position in Europe. Ditto CCPA.
Privacy didn't get 'solved'; we merely shifted culturally to accommodate new notions of it, and now we don't think about it much (even the Privacy Industrial Complex that made a career out of this has pivoted to being a new Disinformation Industrial Complex).
Think I'm being glib and dismissive? Let's take a historical perspective.

If you sat down to a meal in the 80s, and took out a camera and took a photo of your food, while telling everyone you were sending copies to your friends, you'd have been locked up in an insane asylum.
And yet now 'Stories' (which FB ripped from Snap) is basically that, and one of its most popular features.

The Beacon scandal that blew up FB in the late aughts now seems like a joke. People got worked up over that?

We'll read the current disinformation coverage the same way.
You can see the shift in polling by generational cohort. Those raised in a world where smartphones and ubiquitous sharing are just givens think about it very differently.

It's the bridge generation (looks in mirror) that's mostly freaking out about it.

https://t.co/LqB2xNe7Cw
Note, I'm not dismissing disinfo complaints. It's clearly a real problem that's produced human suffering in places like India and Brazil. I'm questioning our ability to do anything about it at scale while still maintaining the technology as is (i.e., forget Butlerian Jihads).
Nor am I saying there's *nothing* anyone can do about it. FB policing (or trying to anyhow) political advertisers much more severely *is* a solvable problem, and one they should undertake (and be taken to task if they slip). But that gets back to my earlier point....
Which is that it's hard for anyone to discern what's worth worrying about across this immense gulf. The techies don't see the bigger picture, the public doesn't see the disruptive vision, and the chattering classes are wrapped up in exploiting the very spectacle they claim to deride.
So, we'll muddle through, as we've always done. It'll get worse before it gets better. Mistakes will be made, and then doubled down on, again and again.
We as a species are dumb. We don't learn anything, and only technical and scientific knowledge is cumulative.
Doubt me? Compare the conversations on this service with one of Socrates' dialogues. Are we smarter now? More respectful in dialogue, more clever in our conclusions? I don't think so. We (or some of us) just know how to make things like smartphones now. Best of luck. We'll need it.


A brief analysis and comparison of the CSS for Twitter's PWA vs Twitter's legacy desktop website. The difference is dramatic and I'll touch on some reasons why.

Legacy site *downloads* ~630 KB CSS per theme and writing direction.

6,769 rules
9,252 selectors
16.7k declarations
3,370 unique declarations
44 media queries
36 unique colors
50 unique background colors
46 unique font sizes
39 unique z-indices

https://t.co/qyl4Bt1i5x


PWA *incrementally generates* ~30 KB CSS that handles all themes and writing directions.

735 rules
740 selectors
757 declarations
730 unique declarations
0 media queries
11 unique colors
32 unique background colors
15 unique font sizes
7 unique z-indices

https://t.co/w7oNG5KUkJ


The legacy site's CSS is what happens when hundreds of people directly write CSS over many years. Specificity wars, redundancy, a house of cards that can't be fixed. The result is extremely inefficient and error-prone styling that punishes users and developers.

The PWA's CSS is generated on-demand by a JS framework that manages styles and outputs "atomic CSS". The framework can enforce strict constraints and perform optimisations, which is why the CSS is so much smaller and safer. Style conflicts and unbounded CSS growth are avoided.
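
To make the "atomic CSS" idea concrete, here's a minimal sketch in TypeScript (a hypothetical illustration, not Twitter's actual framework or API): every unique property:value declaration is emitted exactly once as a tiny single-purpose rule, and components compose the generated class names.

```typescript
// Minimal sketch of atomic CSS generation (hypothetical, for illustration only;
// not Twitter's internal framework). Each unique "property:value" declaration
// is emitted once as a tiny single-purpose rule.

const atomicRules = new Map<string, string>(); // "color:white" -> "c0"

function atomicClass(property: string, value: string): string {
  const declaration = `${property}:${value}`;
  let className = atomicRules.get(declaration);
  if (className === undefined) {
    className = `c${atomicRules.size}`; // short, deterministic class name
    atomicRules.set(declaration, className);
  }
  return className;
}

// A component hands over a style object and gets back a class list.
function css(style: Record<string, string>): string {
  return Object.entries(style)
    .map(([prop, value]) => atomicClass(prop, value))
    .join(" ");
}

// Only rules that were actually requested are ever generated ("on-demand").
function renderStyleSheet(): string {
  return [...atomicRules.entries()]
    .map(([decl, cls]) => `.${cls}{${decl}}`)
    .join("\n");
}

// Two components sharing "color:white" reuse the same one-line rule, so the
// stylesheet grows with unique declarations, not with components or authors.
const button = css({ color: "white", "background-color": "blue" });
const link = css({ color: "white", "text-decoration": "underline" });

console.log(button); // "c0 c1"
console.log(link);   // "c0 c2"
console.log(renderStyleSheet());
// .c0{color:white}
// .c1{background-color:blue}
// .c2{text-decoration:underline}
```

This is why the PWA numbers above show 757 declarations but 730 unique ones (near-total deduplication), versus 16.7k declarations and only 3,370 unique in the legacy CSS: output size tracks unique declarations, not the number of components or people writing styles.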
A common misunderstanding about Agile and “Big Design Up Front”:

There’s nothing in the Agile Manifesto or Principles that states you should never have any idea what you’re trying to build.

You’re allowed to think about a desired outcome from the beginning.

It’s not Big Design Up Front if you do in-depth research to understand the user’s problem.

It’s not BDUF if you spend detailed time learning who needs this thing and why they need it.

It’s not BDUF if you help every team member know what success looks like.

Agile is about reducing risk.

It’s not Agile if you increase risk by starting your sprints with complete ignorance.

It’s not Agile if you don’t research.

Don’t make the mistake of shutting down critical understanding by labeling it Big Design Up Front.

It would be a mistake to assume this research should only be done by designers and researchers.

Product management and developers also need to be out with the team, conducting the research.

Shared Understanding is the key objective.


Big Design Up Front is a thing to avoid.

Defining all the functionality before coding is BDUF.

Drawing every screen and every pixel is BDUF.

Promising functionality (or delivery dates) to customers before development starts is BDUF.

These things shouldn’t happen in Agile.


THREAD: 12 Things Everyone Should Know About IQ

1. IQ is one of the most heritable psychological traits – that is, individual differences in IQ are strongly associated with individual differences in genes (at least in fairly typical modern environments). https://t.co/3XxzW9bxLE


2. The heritability of IQ *increases* from childhood to adulthood. Meanwhile, the effect of the shared environment largely fades away. In other words, when it comes to IQ, nature becomes more important as we get older, nurture less.
https://t.co/UqtS1lpw3n
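
(For clarity on what "heritability" and "shared environment" mean in claims 1 and 2, here is the standard behavioral-genetics variance decomposition; this textbook framing is my addition, not the thread's.)

```latex
% Standard ACE model (textbook framing, added for clarity):
% phenotypic variance = additive genetic + shared env. + non-shared env.
\sigma^2_P = \sigma^2_A + \sigma^2_C + \sigma^2_E
% Heritability h^2 is the genetic share of variance; claim 2 says h^2
% rises from childhood to adulthood while the shared-environment share
% c^2 fades toward zero.
h^2 = \frac{\sigma^2_A}{\sigma^2_P}, \qquad c^2 = \frac{\sigma^2_C}{\sigma^2_P}
```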


3. IQ scores have been increasing for the last century or so, a phenomenon known as the Flynn effect. https://t.co/sCZvCst3hw (N ≈ 4 million)

(Note that the Flynn effect shows that IQ isn't 100% genetic; it doesn't show that it's 100% environmental.)


4. IQ predicts many important real world outcomes.

For example, though far from perfect, IQ is the single best predictor of job performance we have – much better than Emotional Intelligence, the Big Five, Grit, etc. https://t.co/rKUgKDAAVx https://t.co/DWbVI8QSU3


5. Higher IQ is associated with a lower risk of death from most causes, including cardiovascular disease, respiratory disease, most forms of cancer, homicide, suicide, and accident. https://t.co/PJjGNyeQRA (N = 728,160)
"I lied about my basic beliefs in order to keep a prestigious job. Now that it will be zero-cost to me, I have a few things to say."


We know that elite institutions like the one Flier was in (partial) charge of rely on irrelevant status markers like private school education, whiteness, legacy, and ability to charm an old white guy at an interview.

Harvard's discriminatory policies are becoming increasingly well known across the political spectrum (see, e.g., the recent lawsuit alleging discrimination against East Asian applicants).

It's refreshing to hear a senior administrator admit to personally opposing policies that attempt to remedy these basic flaws. These are flaws that harm his institution's ability to do cutting-edge research and to serve the public.

Harvard is being eclipsed by institutions that have different ideas about how to run a 21st-century university: Stanford, for one; the UC system; the "public Ivies".