Categories
discourse

The Jig Is Up

I was listening to a recent episode of the Tech Won’t Save Us podcast today, the one with Ben Tarnoff. He led with a historical point from his book on the Luddites: that the famed workers weren’t anti-technology.

Contrary to the received wisdom that they were anti-technology, which is often used as a proxy for anti-progress or a refutation of modernism, their protest was simply the destruction of property. They were very much knowledgeable about the tech, and forward-thinking; destroying property, some of which happened to be the machines and materials of newly built, partially automated factories, was a stand against devaluing human knowledge, effort, and ability. Knowing their livelihoods were at stake and that the only real value went to the owners of the factories, they fucked shit up.

What could be more familiar to rich men, after all, than people who have nothing to lose destroying their property? The factory owners were surely friends with the people who owned the printing presses and the paper mills that, via newspapers, were the only source of information for almost anyone who worked. The people with the ability to control the public narrative, when there’s money at stake, do. Tarnoff comments on this, noting the parallels with the authoritarians today who would cast anyone protesting police violence and demanding justice as mindless and violent thugs.

Trivializing the destruction of property and making it a moral failing never accounts for the source of what the law-and-order types are so pissed about: fear. The folks whose fortunes and property are ill-gotten and depend on structural inequality are scared that the jig is up, so they go on the offensive.

This is not news to most, I’m sure, but it never fails to amaze me that power structures impose themselves in the same way over and over again. I just happen to have read Ellen Pao’s Reset and Chimamanda Ngozi Adichie’s We Should All Be Feminists this week, and the core of both of these books is that people in a position of power, even seemingly friendly ones, will not hesitate to act out when their privilege is threatened.

What is surprising to me is that this story is just repeated again and again. We have a president who has innovated in the public discourse by using pure projection and constant chatter to avoid accountability. Why do we do the work of accounting for them while they move on to the next attack? By the time we’re exhausted, their children will just take their place.

We need to talk to each other, instead.

Categories
stats don't lie

Now with 10% less hate speech!

My grandpa had a joke about being a worker in a world that cared only about business people. He would ask you, “How’s the business?” and then reach out his hand to shake yours. If you reached to shake it, he would pull it back, shake his own hand, and laugh at you.

I had to laugh reading the report from the social network everyone hates about how well it’s shaking its own hand. Actually, I laughed reading a digest reporting, “The tech giant says 88.8% of all the hate speech it removed this quarter was detected by AI, up from 80.2% in the previous quarter.”

What, exactly, does this report?

“Of all of the sandwiches I ate this quarter, 88.8% were delicious, up from 80.2% in the previous quarter.”

Do you get a strong sense, in the grand scheme of my life, of how many sandwiches I ate between Jan 1 and Mar 31, 2020?

What about the total number of things I ate?

Is it very helpful to note my definition of “sandwich” includes hot dogs?

How about if I tell you the number of hot dogs I ate was zero?

We’re doing great on delicious!
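Here’s the same arithmetic as a minimal sketch. The only number taken from the report is the 88.8%; every total below is invented, purely to show what the headline metric does and doesn’t tell you.

```python
# Made-up totals; only the 88.8% figure comes from the report.
total_posts = 1_000_000       # everything served this quarter ("everything I ate")
hate_posts = 50_000           # actual hate speech posted (never reported)
removed = 10_000              # hate speech the platform removed
removed_by_ai = 8_880         # removals flagged proactively by AI

proactive_rate = removed_by_ai / removed   # the number in the press release
prevalence = hate_posts / total_posts      # what you would actually want to know
left_standing = hate_posts - removed       # not measured, not reported

print(f"Proactive detection rate: {proactive_rate:.1%}")   # 88.8% -- sounds great
print(f"Prevalence of hate speech: {prevalence:.1%}")      # not in the headline
print(f"Hate speech left standing: {left_standing:,}")     # not in the headline either
```

Nothing about the first number constrains the other two, which is the point.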

Categories
musical chairs

Shitty Copy of a Pretty Good Copy

Last night while I was walking the dog I listened to the late, great Yellow Swans, and I continued the vibe today as I was working, searching YouTube for live clips. I’ve heard all of their albums and never had the opportunity to see them live. Noise bands have lots of clips on YouTube, so it was easy to find a full show and press play.

Kind of zoning out as one does when listening to music at work (i.e., not actually listening closely), I didn’t really pay attention when a very simple and repetitive recording came on next. Ambient music and noise music are tonal colors in the same sonic palette.

My ears perked up, however, when this unexpected sound came on toward the end of Low Level Listening, Part 3 by Stars of the Lid.

This is of special note because YouTube is a specifically digital medium. It’s not weird to find digitized versions of analog recordings on YouTube, of course. Virtually none of the music that was previously released in analog formats has been digitized; only what copyright owners (usually not the artists) thought they could make money from. Since Stars of the Lid are an independent outfit making music with very little commercial potential, it didn’t surprise me to find someone had recorded their LP and put it online.

As I’ve also seen, YouTube seems to make an effort to identify copyright owners and license these recordings. Even though someone who’s not the band recorded and uploaded the digitization illegally, violating the platform’s terms of service along the way, the platform contacts the owners and asks if they’d like to leave it up there and collect any revenue it generates.

I could guess this is a strategy that portrays the company as acting in good faith for all of the times it generates and collects ad revenue on someone else’s work. And I totally get that there are possibly some challenges in identifying artists as well. Maybe it’s kind of cool?

Except there is already a digital version of this recording. I checked Spotify and Apple Music. That version is not this second-generation digitization of an LP.

In fact, every digital copy is probably an imperfect copy. We have this idea that digital transmission is perfect, complete reproduction, but it’s more accurate to say it’s a possibly perfect, likely complete reproduction. That’s what “sample rate” means: the signal is only measured so many times per second, and playback reconstructs what happened in between. At any given instant, the player is making a very good guess at what it’s supposed to sound like.

We only become aware of it when it’s fantastically wrong, when it sounds glitchy, but some amount of error is always there, predictably, in some fraction of the samples. Our ears just aren’t good enough to hear it.
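If you want to see the approximation at its smallest, here’s a minimal sketch using CD-style numbers (nothing to do with this particular recording): a pure tone gets sampled 44,100 times a second, and each sample gets rounded to the nearest value a 16-bit integer can hold. The stored copy is close, and never exact.

```python
import numpy as np

# A 1 kHz tone, "recorded" at CD-style settings: 44.1 kHz sample rate, 16-bit samples.
sample_rate = 44_100
t = np.arange(0, 0.01, 1 / sample_rate)          # ten milliseconds is plenty
signal = np.sin(2 * np.pi * 1_000 * t)

# Round each sample to the nearest 16-bit value, the way a CD or digital file stores audio.
quantized = np.round(signal * 32767) / 32767

# The stored copy is not the signal; it's a close approximation of it.
error = np.abs(signal - quantized)
print(f"max error per sample: {error.max():.2e}")  # tiny, and never zero
```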

Digital just makes it easier to copy, which means it’s easier to make crappy copies faster. I didn’t notice this recording sounded like a recording because it’s an ambient album by a band that makes washes of sound. It’s quite common for ambient bands to add non-musical sounds or even noise for effect, aesthetic, or any other reason.

But that obscures the fact that the second I was paying attention, I realized this was a far inferior copy to any other I could be listening to. The Uncanny Valley just keeps replicating itself out there in the digital surveillance economy, but we accept the convenience of it because, well, in my adult life I remember a time when it wasn’t even possible to get access to an unlicensed live recording of the Yellow Swans. Hell, if you wanted an authorized one that was pretty tough, and most of the labels that have ever released their music are either defunct or dormant. So this is fucking great!

There it was though, generating revenue for YouTube (somehow, I’m sure; my browser blocks ads, but that doesn’t mean someone’s not making money) and maybe, someday, a fraction of a cent for the band, from a shitty copy of a pretty good copy.

Categories
design facial recognition marketing you know, for kids

No cam. No mic. We found other ways to surveil your children.

Projection is not only a defense mechanism, the one where we rationalize the world by reading other people’s behavior as being motivated by what motivates us, say trauma or abuse. It’s also how marketing works. When you do it deliberately, it’s called “advertising” or “business development” or “advertainment” or whatever the tech news calls itself these days.

However, even when it’s done deliberately, the mechanism that fuels the intention and the enthusiasm for an idea still comes from somewhere in your brain that isn’t easily understood, and that is desperately hungry, all the time. Your id breaks through and tells us what’s really going on, and you don’t notice, because you’re using your rational brain (you know, to make an ad campaign for a smart speaker for children that supposedly avoids the problems of surveillance capitalism by having no mic, no camera, etc.), so you don’t know you’re telling on yourself.

The Yoto smart speaker is a device that connects to the cloud to deliver content to pre-verbal children. “No cam. No mic. No funny business” is an interesting claim if you believe they’re projecting what they believe when they’re asking you to believe something about them. What funny business do you mean? Are you saying it’s a completely offline device that delivers new content without having to purchase cartridges or tapes or CDs? Because that’s awesome.

In fact, I had one myself and I loved it. It trained me to handle and fetishize my parents’ objects so I could learn to consume them, but that’s cool. I like music.

No, Yoto just wants to collect, store and monitor your child’s behavioral data, just like everyone else. “Parents can also upload content they select (say, songs from a playlist, or a certain audio book) to blank cards using a parent app; the cards work using NFC technology, like a contactless credit card, that link to content stored on Yoto’s servers.”
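None of Yoto’s internals are public to me, so here is a purely hypothetical sketch (the names, fields, and endpoint are all invented) of how a “no mic, no cam” device can still phone home: the card only carries an identifier, so every tap has to go to the server, and every tap is therefore a loggable behavioral event.

```python
from datetime import datetime, timezone

# Hypothetical sketch; nothing here is Yoto's actual code or API.
def on_card_tap(card_id: str, player_id: str) -> dict:
    event = {
        "player_id": player_id,                               # which household's device
        "card_id": card_id,                                   # which story or playlist
        "tapped_at": datetime.now(timezone.utc).isoformat(),  # when the child played it
    }
    # A real device would send `event` along with its request for the audio,
    # something like https://api.example.com/cards/{card_id}. No mic or camera
    # required for the company to know what was played, when, and how often.
    return event
```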

They’ll probably sell it too, since many companies that do the former do the latter; some only do the former to enable the latter. But we haven’t even looked up the founders of the company yet.

Elizabeth Bodiford has a nice way of describing this kind of behavior in her poem, We Tell On Ourselves:

We tell on ourselves by the way that we walk.

Even by the things of which we talk.

Categories
government

When someone shows you who they are, believe them the first time.

Today at work in a What Did You Do This Weekend conversation, my friend was telling me about making calls for a candidate in the upcoming primary.  Talking about how satisfying it is to hear someone go from disenfranchised to sounding eager to register to vote, she was like, “If only I could get my mom to vote.”

Her mother was born during World War II, in an Eastern European country that doesn’t exist anymore, and told her that she’s never voted anywhere she’s lived because it’s not safe to vote for candidates that are truly liberal and propose policies that are against the status quo.  “She gets so happy hearing me tell her about calling to register people to vote, or tell them about Bernie, living vicariously through me,” she explained.

“So I’m like, ‘Ma, you can do it too.  Let’s go and register you to vote this weekend,’ and she’s like (my friend makes a swatting motion, as in total dismissal), ‘Are you crazy?  It’s risky to vote in primaries.’ (making a sour face and another swatting motion) ‘The whole point of a primary is the government tracking who you would vote for before you vote for real. No way.’”

The funny thing about this 80-year-old saying voting is risky is that she has lived experience that informs it.  Born while Hitler was in charge.  Grew up in a puppet regime that Stalin was in charge of.  Worked and travelled most of her adult life in other countries where the government was obviously installed by another government or the military, or was controlled by a tiny elite of power brokers and oligarchs.

Even though it’s been dialed back a bit in the past couple of decades, what she’s saying is, “Yeah, I get you live in the United States, and you grew up when this sort of thing was on the wane where we lived and where we live now, but do you think the fascists stop fascisting just because you like funky sweaters and protest marches?”

Sorry, that’s flippant and this is real.  What she’s saying is, “Have you ever even seen a fascist regime?  I have.  It looks like this,” and spreading her hands like a game show host showing you the fabulous prizes all around you.

The title of this post is a well-known quote from Maya Angelou. The someone I mean is the military-industrial complex.

I thought it was ridiculous to consider that the world could ever be any way other than what I could see at the time I was seeing it, even though I grew up with people who weren’t citizens of any country in 1945. Following the defeat of the totalitarian regime that had taken them prisoner for 6 years, another totalitarian regime was annexing their country of origin.

This is Poland we’re talking about. I was born with citizenship where I was because the largest mining company in the world did things like sponsor people for citizenship and pay their emigration fees in exchange for a lifetime of working in a mine.  Even though my family mostly had firsthand lived experience that was not much like mine when I grew up, and even though they had lots of evidence suggesting things seemed okay, my grandparents thought it foolish to be anything but vigilant and skeptical. I thought this was ridiculous, then.

75 years ago Auschwitz was freed.

50 years ago the Rev. Dr. Martin Luther King was murdered.

20 years ago we started a war because of “weapons of mass destruction” that didn’t exist.

12 years ago we held up signs that said CHANGE and graffiti artists immortalized the first president to order over 500 drone strikes, killing over 500 civilians (according to the Council on Foreign Relations).

20 months ago the president who used to be a game show host revoked the rule requiring the government to report the number of civilians it killed when it was trying to kill people it thought weren’t civilians.

My grandmother was 12 when WWII started, and lived in a tiny remote village in a country annexed by the Nazis.  She thought it was ridiculous how kids like me violated her social norms with stuff like our hair styles and noisy music, and broke minor laws that were easy for privileged white kids to get away with. Not because she was offended, but because that would get you killed where she grew up.

I even knew about and believed things like MK Ultra and the US support of the Khmer Rouge. I was the kid who could tell you Coca Cola, Ford, and GM profited enormously from supplying Nazi Germany with soda pop, trucks, and planes, among other things.

I thought it was ridiculous.

What an idiot.

Google, Microsoft, Facebook, and Amazon want to help the government decide how to regulate AI.

Google, Microsoft, Facebook, and Amazon sell the government law-enforcement, military, infrastructure, and marketing software tools powered by AI.

If you think it’s ridiculous that conflicts of interest are a concern there, or that there’s potential danger to the liberty and safety of people as a result of these agreements, that’s okay.

You’re not ridiculous.

You’re not an idiot.

Prajna is a journey, not a step.

Categories
marketing

Shoshana Zuboff in Sunday’s NYT

I have lots to say about Shoshana Zuboff’s piece in this week’s New York Times, but it’s late and I thought I’d give you a chance to look it over before I say anything about it. Enjoy. It’s a great primer for her weighty The Age of Surveillance Capitalism (and a lot more accessible).

If you need a laugh, even if your sense of humor tends toward the acid, I don’t recommend reading this thing I stumbled on last week, Microsoft’s book-length ad for buying AI from them. Though it’s an interesting companion piece to contrast with Zuboff.

It’s a real barrel of laughs. No kidding, I bought it just because in a few years it will be funny that it was published as a paperback at all. I’m laughing all the way to Satya and Jeff not even needing a device to track me down, because in fact I bought a physical book from an online retailer, which was tracked from the first click to a picture of it being taken and sent to me to prove it was delivered.

Categories
design

Chinese Finger Trap

I was in a hallway conversation where a designer peer who has been around the block asked for my take on a problem.  It was about changing the surface representation of a given digital experience (let’s call it a “skin,” though we were talking about a speech interaction).

Applying that skin takes a couple of steps for procedural reasons, such as identification, authentication, purchasing, and changing default settings.  The way this system is architected, when you get to the end, you get a confirmation saying, “Hey, I applied that skin like you said, and that’s how it is now.”

The question was, “What do you do if the user says, ‘oh hell no change that back!'”

So I asked:  Are we sure the user wanted to do the thing?

Well, this is the fourth step of a flow, so it would be fairly difficult to get all of the way there with false accepts at every step. So yes, let’s assume so.

The next thing I asked was:  The thing is both a purchase and a pretty specific choice.  You don’t buy an Andrew Dice Clay comedy album or a Scientology text by accident.  So is this more like that?  Or more like ordering in a restaurant: “Actually, could I have the salad instead?”

Definitely Andrew Dice Clay.  You picked this on purpose and did work to get it.

So what is the concern?

Well, some of the people we talked through the design with were like, “How do I cancel out if I suddenly realize I did the wrong thing?”

My final question I kept to myself.  We talked through ways that you could identify and capture ranges of intents and do daily or hourly log queries, some general tech capabilities we might be able to apply here, and that was that.  Back to work.

My final question was, “Why did we build a Chinese finger trap?”

Ignoring the ways you might implement a system to use progressive disclosure, familiar words, legalese tick boxes, and numerous steps to ensure that a user would not even end up in the situation where this problem was possible, my primary concern was that the people who had driven the product from the beginning had built it as a trap.

Get the revenue!  Get the impressions!  Get the clicks!  Get the engagement!  Roach motel it!

(Obviously these folks would not say “roach motel it.” Except for people who straightforwardly adopt the most coercive tactics of Hooked and other manipulation textbooks, most people I’ve seen solving this problem this way are merely doing what their boss said.  If they were role-playing game characters they’d be “Unprincipled” or “Neutral.”)

The real problem they were trying to solve was that once they saw the implementation-level experience, they realized the user would see it was a trap and back out.  The problem was it wasn’t deceptive enough.

But it was too late to build it differently, so now we could only bolt things on at the end and say they represented safety and choice.

Like seat belts in 1968.

Build a velvet rope and an exit door to the roach motel if people might decide they don’t want to stay.
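For what it’s worth, the exit door can be almost trivially small. This is a hypothetical sketch only: the phrases, the grace window, and the revert_skin() call are invented to illustrate the idea, and none of it is a substitute for not building the trap in the first place.

```python
import re
from datetime import datetime, timedelta

# Hypothetical sketch only; not any real product's design.
REVERT_PATTERNS = [
    r"\bchange (it|that) back\b",
    r"\bundo (it|that)\b",
    r"\bgo back to (the )?(old|previous)\b",
]
GRACE_WINDOW = timedelta(hours=24)

def handle_utterance(utterance: str, applied_at: datetime, revert_skin) -> str | None:
    """Return a spoken response if this was a revert request, else None."""
    if not any(re.search(p, utterance.lower()) for p in REVERT_PATTERNS):
        return None                          # not about reverting; normal flow continues
    if datetime.now() - applied_at <= GRACE_WINDOW:
        revert_skin()                        # restore the previous default
        return "Okay, I changed it back."
    return "That change is more than a day old; you can switch it back in settings."
```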

Categories
facial recognition

You Sure As Shit Can Ban Technology

In this week’s Sunday New York Times there’s an article about Clearview AI, a company taking advantage of a lack of regulation in personal and personally-identifying data to market a facial recognition application to law enforcement agencies.   The basic function of their product is to input a picture of anyone and output potential identity matches, including photos, confidence scores and links to the matched data.

The  article quotes David Scalzo, founder of a venture capital firm that was an early investor:

“I’ve come to the conclusion that because information constantly increases, there’s never going to be privacy,” Mr. Scalzo said. “Laws have to determine what’s legal, but you can’t ban technology. Sure, that might lead to a dystopian future or something, but you can’t ban it.”

AI is particularly ripe for this kind of ploy, the type that master manipulators who throw up their hands and say, “We never broke the law!” depend on for their arguments to have something soft to land on.  Even if it’s bullshit, it’s softer than the cold hard ground.

More importantly, it’s a long-term strategy, and that something soft is this: if you say it often enough in public, and other people are saying it too, it has the potential to become true. Our human brains have a hard time dismissing information that wasn’t in focus at the time we took it in. We absorb background information just the same as anything else; we just don’t act on it the same way.

Like mercury in fish, it accumulates familiarity in journalism’s fatty tissues when you’re not paying too much attention, and only when the accumulation reaches a critical level is there any discussion, after it’s too late. Polluters, bankers, human traffickers, fossil fuel purveyors, that kind of character (Scalzo runs a private equity firm).  The argument goes something like this:

  1. State as fact a state of the world that has to be true for your product to be acceptable.
  2. State as fact that this state of the world is inevitable.
  3. State that while there are always bad apples, it’s for the courts to decide, later, maybe.
  4. Restate 1) and 2) as a final coda.

In the case of this statement, it’s demonstrably bullshit.  Clearview AI claims to have scraped over 3 billion photos for their database.  Yes, with a b.  As widely reported in 2019, the legality of scraping sites for training data isn’t really an open question: it’s not legal, except in certain circumstances involving Creative Commons licenses. (And even CC is speaking up to say, “Whoa, that’s not what CC is for, Yahoo and IBM.”)

There’s nothing about information a priori that creates a privacy threat for individuals, only the value of individuals’ personal data to commercial and institutional interests.  It doesn’t matter what Scalzo’s opinion is anyway, since he has a vested interest.  

More importantly, you can absolutely ban technology.  You can ban it all day, and it’s done all the time.  In plenty of jurisdictions you’re not allowed to own a silencer for a pistol, a Cruise missile, or a radar detector, or to set up a microphone, X-ray, and camera array around your house to monitor your movements.  The thing that bans this is laws.  The law states “this is not allowed,” and almost everyone who would have done it for some anti-social purpose just because it was allowed would then not do it, because if they got caught they would get in trouble.

You can’t ban the idea of a Cruise missile, but that’s fine.  The idea might be used for all kinds of things.  The idea of matching a picture to a database of other pictures as a form of search already has all kinds of miraculous uses, and in essence isn’t that different from searching for text, whether you do it online or open a phone book to P to figure out what pizza place you always call that you only know by seeing the ad on the page.
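To be concrete about what “matching a picture to a database of other pictures as a form of search” looks like, here’s a generic sketch, not Clearview’s system or anyone else’s: some model turns each photo into an embedding vector, and the search is just finding the stored vectors nearest the query’s. The “confidence score” is the similarity.

```python
import numpy as np

# Generic sketch of image-to-database matching, not any particular product.
# Every photo is reduced to an embedding vector by some face-recognition model;
# "search" is just finding the stored vectors closest to the query's.
def top_matches(query_vec, database_vecs, labels, k=5):
    q = query_vec / np.linalg.norm(query_vec)
    db = database_vecs / np.linalg.norm(database_vecs, axis=1, keepdims=True)
    scores = db @ q                              # cosine similarity against every record
    best = np.argsort(scores)[::-1][:k]
    return [(labels[i], float(scores[i])) for i in best]   # name + "confidence score"

# Toy data standing in for embeddings scraped and indexed ahead of time.
rng = np.random.default_rng(0)
database = rng.normal(size=(1000, 128))
names = [f"person_{i}" for i in range(1000)]
query = database[42] + rng.normal(scale=0.1, size=128)     # a new photo of person_42
print(top_matches(query, database, names))
```

The hard part isn’t the arithmetic; it’s the 3 billion scraped photos standing in for that toy database.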

Is it difficult to ban the nefarious and anti-social use of software for the purposes of making money?  Probably. (I’d say ask China, but I don’t want you to.)  But the first step is to ban it, and then it’s not okay to do it anymore.  David Scalzo and Clearview AI want it to be okay.

It’s not, and it never will be, but only if we pay attention and talk about these issues, and if we demand that our politicians and law enforcement agencies be principled and thoughtful and push back against any technology that infringes on any individual’s personal liberty or safety just because we haven’t codified laws against it yet.  Laws codify what people like us think, and when we say we think that this kind of behavior is not okay, laws can ban technology just fine, thank you.

Categories
musical chairs

The Horn Bearer, Part 2

Apple Music has an automatically generated playlist called For You that (from what we can understand by using it) is based on two things:

  • Artists whose music you’ve added to your library
  • Artists you’ve listened to as a result of search or some other non-Apple-curated function

Interestingly, it doesn’t matter whether you’ve ever listened to the music by those artists. If you add an album to listen to later, but never actually play it, this album will figure into your For You playlist at least until you remove it.

More interestingly, it doesn’t matter whether the digital entity it adds to your playlist is a song.
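If I had to guess at the logic (pure speculation, nothing here is Apple’s actual code), it’s something like this: every catalog ID is an atomic item with signals attached, “in your library” counts whether or not you ever pressed play, and nothing anywhere knows that one track might only be half of a song.

```python
from dataclasses import dataclass

# Speculative sketch of a "For You" candidate scorer that would behave this way.
@dataclass
class Track:
    track_id: str   # the catalog entity; there is no notion of "whole song" here
    artist: str

def candidate_score(track: Track, library: set[str], play_counts: dict[str, int]) -> float:
    score = 0.0
    if track.track_id in library:
        score += 1.0                                        # being added is enough
    score += 0.1 * play_counts.get(track.track_id, 0)       # plays help, but aren't required
    return score

library = {"album-track-12"}          # added to the library, never actually played
plays: dict[str, int] = {}
fragment = Track("album-track-12", "some band")
print(candidate_score(fragment, library, plays))            # > 0: into the playlist it goes
```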

In my case, For You included a track called The Horn Bearer Part 2.

Melvins released the CD and digital editions of their album “The Maggot” with every song split into two tracks.  Played in its entirety, the album sounds continuous, and the listener hears the whole songs as intended.

Apple put a song by them from that album that I like very much – The Horn Bearer – into my playlist. But they didn’t put the whole song in.

Just the fragment called “Track 12 The Horn Bearer Part 2.”

That’s not a song, but Apple Music would never know the difference. But to the listener it’s a song that starts in the middle. An error. A glitch.

By design, because the customer is not me, and the purpose isn’t listening. The purpose is encouraging consumption, and the customer is the catalog provider and Apple.