“The Tyranny of Visibility” | Future Commerce



Kyle: [00:00:00] To me, it’s like that tyranny of visibility is the bigger part of it where not everything has to go to everyone. Bands can be popular in one city without being popular in all cities. You can like a restaurant on your block because it’s on your block and because it’s specific to itself without it needing to become a meme on Instagram. So I think taking away that elitism of tastemaking and instead just being, like, you should appreciate what’s around you and participate in the cultural ecosystem that’s around you.

Brian: [00:02:04] Hello, and welcome to Future Commerce, the podcast at the intersection of culture and commerce. I’m Brian.

Phillip: [00:02:09] And I’m Phillip. And, Brian, this is kind of a big moment for Future Commerce in that we finally have, I think, someone who feels very spiritually aligned with us come and talk about some of these things that we’ve been resonating on for a few years. Of course, in the B2B space, amongst people in the commerce industry, algorithms are lauded as the future of commerce. Technology-driven recommendations, the kinds of things that lump you into look-alike audiences with other people so you can behave just like them, have been praised endlessly. We’ve been the voice in the wilderness asking what the real impact of that is. And the person who has now put pen to paper, metaphorically speaking, to actually capture what some of the impacts are on culture is here with us today to talk about their new book. Kyle Chayka is a staff writer at The New Yorker where he writes a column on digital technology and the impact of the Internet and social media on culture. His new book, Filterworld, out now, talks a lot about the impacts that algorithms have on what he calls the flattening of culture. Welcome to the show, Kyle.

Kyle: [00:03:14] Thanks for having me.

Phillip: [00:03:16] Yeah. And you’ve been on quite the tour after writing this book. I’ve heard a lot of your interviews, but for those who aren’t familiar, can you give us a brief download on the point of Filterworld and what you think the impact of algorithms is on culture?

Kyle: [00:03:30] Yeah. I really wrote this book because I, like many people, had been on the Internet a lot over the last decade and kind of experienced the rise and dominance of these huge social media platforms that have absorbed so much of our attention. And in the last few years, I’ve just been observing how all of those feeds and recommendations and the digital spaces that we exist in, they don’t seem to be so inspiring anymore. They seem to be serving us stuff that might be irrelevant or boring or redundant. And, I think, I and many other people have gotten frustrated with the kind of state of the Internet. And so I wanted to write this book to kind of figure out how we got here. How over the 2010s did these huge digital platforms emerge, and why are they structured the way they are, and why have algorithmic recommendations and feeds become the main way that we experience so much of culture, essentially, on the Internet. And the book is about culture, which is a very broad term. To me, culture is music, design, commercial products, visual art, and kind of anything from entertainment to a painting, I think, is now influenced by its existence on digital platforms.

Brian: [00:05:03] Yeah. That’s incredible. I kinda wanna start at the end because I feel like that’s sort of a new beginning for you. You got disillusioned with this, and in the book, you document how you actually leave the whole world of algorithmic recommendations and just kind of take a break from everything. And then eventually, I think, you document getting back on. How has it been? When did that happen? How long has it been since then? And what has the impact been on you personally? I think that would be a good way to get into some of the stuff you talk about in the book.

Kyle: [00:05:44] So I think I did this algorithm cleanse, as I called it, starting in, I think, September 2022. So this was in the midst of writing the book. And a book takes a very long time as we all know. So I have been working on the book in various ways since about 2020, though the thoughts that go into it go back much farther than that. So I was in the middle of writing the book. I was definitely overwhelmed by the Internet, and I wanted to figure out some way to find a solution to this problem that I was observing. As I was writing the book, I was documenting the negative influence of algorithmic feeds, the kind of obsession we have with digital platforms. And I didn’t wanna be so pessimistic. I wanted to offer people some kind of answer to that situation or an escape route. So I figured that I should try this myself. I should just try to totally get off all of these things and experience life and the Internet actually without algorithmic feeds. And so I just deleted all the social media accounts that I had. I got TikTok off my phone. I stopped listening to Spotify, and that lasted for about 3 months at the end of 2022. It sounds so mundane when you talk about it. Like, Oh, yeah. Sure. You deleted all these apps for 3 months. But it really was a way to reset my relationship to the Internet and my habits of discovering things for myself. I could no longer rely on Twitter or Instagram or TikTok just constantly feeding me new things. And it was surprisingly difficult. I mean, it took a few weeks, I think, even to just adjust to the lower level of stimulus that was happening in my day to day life.

Phillip: [00:07:39] There’s an interesting, I’d say, recurring theme throughout the book where there’s this idea of friction that we incur by having to go out of our way to not use algorithms. And part of your cleanse sounds like a lot of friction was incurred on your part, in your life, in having to seek out some of the things that still filled in the moments of your life, still finding new music or finding the news. For instance, I think in the book, you mention that you were sort of behind on the news cycle after Queen Elizabeth had passed. And so [00:08:17] not being sort of plugged into the matrix doesn’t mean that your life and the things that fill it in change, it means that you’re enduring more friction personally. [00:08:27] And I’m curious if, in your perspective, removing the friction in a lot of choice-making online is what’s reduced us to this homogeneity that you keep talking about throughout the length of the book.

Kyle: [00:08:44] Yeah. Friction really became a big theme of this book, and I didn’t expect that. I wasn’t thinking so much about friction as an idea when I started out, but it really became this running theme. And I think friction, as you’re describing it, is the difficulty of doing something. It’s the actual labor or time or thought that you have to put into making a decision or choosing one thing over another or even consuming content on the Internet. And all of these systems are really designed to get rid of as much friction as possible. The TikTok For You feed is the most frictionless experience of consuming content because your only action is flipping to the next video. You are only making subconscious choices. It’s more passive even than cable TV in a way. So I think once I got rid of all those frictionless vehicles, those frictionless content chutes, then I had to work through it myself. I had to put in the work and take on that more friction-full, what would we say, slower, more textured experience of consuming things. So that could mean, you know, looking at your New York Times app, which is a slower, more intentional process than looking at Twitter. It could mean going to a bookstore and looking through the magazine rack. It could mean going to an art museum rather than looking at Instagram. It’s more about making your own decisions and slowing down your process of consumption.

Phillip: [00:10:37] Brian, the whole job in eCommerce in particular over the last 20 years has been about removing friction. Right? We don’t want people to think. We just want them to buy. And now that commerce is wherever these points of inspiration are in these frictionless media consumption platforms, we’re starting to see the rise of thinking around, well, how do we get people to buy as frictionlessly in those platforms as they consume content? And I feel like that’s where, Kyle, you have this really killer intersection here with how certain media properties and certain pieces of media, I think you come back to over and over again, Emily in Paris, and how it’s sort of a stand-in and very modernist take on what ostensibly would have been Sex and the City at a prior moment in time and culture, but this is particularly of our generation and for us, and it’s exactly what we deserve almost. That’s my paraphrase of you. But it’s interesting because it is built as a piece of media to also be frictionless, but when brands get involved and it becomes a vehicle for us to make purchase decisions, I can only imagine that at some point in the future we’re looking at those as more consumer shopping vehicles or consumer inspiration vehicles than as vehicles for storytelling and for escapism. Right?

Kyle: [00:12:06] Yeah. I mean, the storytelling seems yoked to consumerism in Emily in Paris in particular. I have many thoughts on this, but TikTok shopping. A lot of these things have happened in the aftermath of me writing this book. I had to finish writing it in 2022, essentially. So we’ve had a full year of the Internet and of content since I wrote it, which is, like, you know, 10 years in Internet time or whatever. So we’ve had two things. We had TikTok shopping, which really does make shopping much like consuming content. You’re watching a video and frictionlessly buying the thing you see in the video. And we had the most recent season of Emily in Paris, which was basically sponsored content. {laughter} There were so many brand partnerships in Emily in Paris that it was just staggering. The whole thing appeared to be a commercial. There were car brands in it. There were real fashion brands in it, wrapped up in this story that’s about marketing, about a woman who is a marketing person and an Instagram influencer. I mean, the snake really ate its own tail with that season, where the content glorified the consumption and the consumption produced the content. It was a crazy, crazy thing.

Brian: [00:14:25] Oh my gosh. Consumption for the sake of consumption. I think, to your point about TikTok shops and this idea of removing friction from the buying process. You bring up this idea of sandcastles, of chasing the algorithm and that frictionless process. I think you did a great job of calling out how dangerous it is to just create content for that feed and for those platforms that constantly feed people things. For brands, you know, in our audience, the problem with chasing something like that is it’s just as easy to move on as it is to make a purchase. So in our industry, there’s movement, and we’ve written about this and talked about this. Actually, maybe you do wanna qualify your buyers because if you don’t, they might not be a customer. They might just be a purchaser. And so, yeah, I think you do a great job of identifying those sandcastles, those things that people build up that are so easy to knock down. A wave of new tech or a new update to the algorithm can just sweep in and take it out so fast.

Kyle: [00:15:34] The experience of a digital platform can change so rapidly, both for a user or for a creator. Suddenly your TikTok feed is full of buying opportunities. Suddenly, all the accounts that you were following are making commercials for their packing cubes or whatever. And I think it just speaks to how difficult it is to create a lasting relationship on these platforms, whether you’re a user trying to connect to a creator or brands trying to connect to consumers, it’s all shifting so much. It reminds me of the media industry too over the past decade. For years and years, the idea was we should just corral as large a swath of audience as we possibly can on the vast public Internet in BuzzFeed style. We don’t care who these people are. We just care that eyeballs are on our content. We don’t care where they’re reading it. It can go wherever. It just matters that the video is stamped with the Tasty brand or whatever. And that has utterly failed. Turns out you don’t want 10,000,000 randos just looking at your stuff. That’s not gonna make a sustainable business necessarily. So that’s the frictionless relationship where you don’t care who they are. You don’t care what they’re doing. You just care that they’re looking at your stuff. I feel like we’re leaning more into friction now in the media for sure. And I tend to think media practices flow elsewhere pretty quickly. So [00:17:13] it’s knowing who your customer is, and cultivating a longer-term relationship, and that requires a kind of friction or slowness or patience in a way. You don’t just want them to frictionlessly convert from a viewer to a buyer. You want them to actually think about something. [00:17:31]

Phillip: [00:17:32] We had Fred Reichheld, the creator of NPS, Brian, come to a dinner of ours in Boston a few months ago. And he was talking about how, in the modern age, this idea of asking someone what they thought about an experience they had also kind of forces them to think about it in a way that people may not be used to. In the current era, where nobody is bored and everybody’s always preoccupied with something, most customers aren’t actually thinking. Unless something bad happens, they’re not really thinking about what they felt about an experience or a product. They’re just seamlessly moving from one experience to the next. And a brand asking “What did you think of that?” actually forces them to make a decision. And in that way, it’s almost its own Schrödinger’s cat: the decision of whether someone really liked a product, service, brand, or experience isn’t made until the brand asks about it. And so, therefore, it’s kind of an incomplete measure, because how they actually felt is highly contextual to when you ask them. And this is a really interesting, almost philosophical issue right now, which is I don’t think that a lot of people… Forget the fact that they don’t wanna incur friction, Kyle, but I think a lot of people aren’t really all that… they wouldn’t be like you, where they start to notice a lot of homogeneity in the world, in all of their experiences. I feel like it’s never actually even considered, and that’s kind of thoughtless. Or is that too reductive? Do you think I’m being unfair?

Kyle: [00:19:12] No. I mean, I tend to not blame the user necessarily. I think we’re all shaped by the technological media that we have in our era. [00:19:29] The digital platforms treat us as passive consumers of content and as fungible user eyeballs. And so that’s how we act. We act as these passive consumers who don’t think about what we’re consuming until we’re given a reason to, and that’s unfortunate. [00:19:51] I think it’s bad for us. It’s bad for us as users and consumers. It’s also not great for the social benefit of a company or of a product or whatever if all it exists to do is create and then commodify attention. That’s not a fundamentally valuable exchange. And the homogeneity, I mean, it is a difficult line to walk. In many ways, homogeneity can be good and effective. It’s nice when a nice hotel is homogeneous. You want a kind of reliable, homogenized experience. You don’t want something quirky. You don’t wanna have to think too much at your hotel. But it’s worse when everything is so homogenized. It’s worse when culture, as I write about in the book, is the subject of that homogenization. And culture is a product in some ways. It makes money. It gets attention, whatever. But we don’t judge, or we shouldn’t judge, the value of culture based on its efficacy as a product, in my opinion. I mean, you can if it’s a microwave. Sure. But for a piece of music or a painting, its value is not just in how much engagement it gets or how much attention it can commodify.

Phillip: [00:21:24] But if we could put that on the blockchain…

Kyle: [00:21:27] {laughter} Well, there we go. Yeah. I mean…

Brian: [00:21:29] I definitely had, as I was finishing your book at 1 AM last night while eating ice cream…

Phillip: [00:21:35] Did you cram this book?

Brian: [00:21:36] Oh, yeah. And, you know, I ran out of scotch. I ran out of scotch, so I found the last drops of old brandy that I had in my cupboard. And I was, you know, working my way through it, and there were so many moments where I was like, “How did we have this simultaneous inspiration sort of moment?” Because so many things resonated so well for me while I was reading. Even the Emily in Paris example is something that we’ve covered in Future Commerce. And I was like, “Wait a minute. Did we get algo pilled too…” I had this meta moment where it’s like, “Oh my gosh. Maybe these thoughts about the algorithm that we’re having are actually inspired by the algo realizing what we’re interested in and just going further and further. We have this whole extra meta moment of Internet clickhole, deep-dive stuff that we all got into.” I got kinda freaked out.

Kyle: [00:22:42] {laughter} No. I think that’s a good… The paranoia… If the book induces some kind of paranoia, I think that’s a good thing. If it suddenly strikes, if you are suddenly self-aware, if it’s the Zen monk having the epiphany of I don’t exist or whatever, I think that’s a nice outcome. But something that I’ve said about my stories or work in the past is that I like to take something that you like, or that you think you like, and then tell you why you actually hate it. {laughter}

Phillip: [00:23:21] {laughter}

Kyle: [00:23:21] By the end of the piece, you should be like, “Oh, wow. I actually don’t like this thing anymore. I can see why my enjoyment of it has actually been destructive or something.”

Phillip: [00:23:35] Give me an example of that from your other writing. What’s one of the targets of that sort of fascination?

Brian: [00:23:41] Please tell me you haven’t attempted this with Sufjan because I noticed that you mentioned him.

Kyle: [00:23:46] Music is not my thing. I originally wrote an essay about Emily in Paris called Ambient Television. “Emily in Paris and The Rise of Ambient Television.” And that was kind of asking myself why I found this show both soothing and enjoyable, but also completely dystopian and empty at the same time. Many years ago, I wrote a profile of Kinfolk Magazine, which was the kind of minimalist hipster lifestyle bible of the early to mid 2010s. And I was kind of like, “Why is this so popular? Why does it exist? Let me think about what makes it tick in a way.” And by the end, once you deconstruct everything, you’re kind of like, “Oh, I don’t like this so much anymore.” Yeah. I mean, it’s not just about critiquing popular culture, I think. It’s not just that something is popular, but there are certain things that are insidious or that encourage you to not think about them. And then when you think about them more, you realize that they’re maybe not so good for you. What you were saying before also makes me think of how, in the experience of writing this book, I often had the feeling that I was just writing about all of our experiences. I didn’t think it was just me, or I hoped that it was not just me. I had a sense, as I was writing, that there was an urgency to these feelings about the Internet, to describing the experiences that we’ve had in the past decade. And I’m glad that people have recognized themselves in it. Because this book is not about me personally. It’s not like a memoir or super intense essay. It’s an act of thinking about what we’ve gone through as the Internet public in the past decade. So I’m always happy when people are like, “Wow. This really told me something about what I experienced online.”

Phillip: [00:27:18] There’s a really important part of this too that I’d like to get into with you, where you talk about the algorithm as a shaper of behavior and an almost labor-inducing force for people who work on digital platforms that are driven by algorithms. A couple good examples… I think one you gave was an Airbnb host having to respond to changes in the algorithm and having to keep up with those changes. I think you describe people living with this as having algorithmic anxiety. They live with a latent sense of fear around whether an algorithm change will cause them to have to work harder or incur some work, or maybe even that their livelihood is at the mercy of the algorithm. Let’s talk a little bit about how you came to understand that and how you see that playing out right now.

Kyle: [00:28:46] Yeah. I think algorithmic anxiety is very real. {laughter} I mean, on a personal level, we can recognize it because we have algorithmic anxiety when either the recommendations are inaccurate, like our Netflix homepage doesn’t know what we actually like, or when they’re too accurate, like TikTok somehow knows that you definitely wanna see this ramen cooking video or something. Something that’s so specific that you don’t know how it got to you. I think it’s particularly acute for people who make money through the Internet. And that’s so many people now, from an Uber driver to an Airbnb host to an artist who sells prints on Instagram. Those livelihoods are always mediated by your access to your audience or to the user base of a given marketplace. And so you’re constantly combating the search algorithm, the recommendations algorithm, the pricing algorithm in the case of Uber. And that’s such an anxiety-inducing position to be in because you have no control over that. There’s no way to talk back to those systems. There’s no way to change how they operate. They’re mostly opaque. We don’t know what the TikTok algorithm is. We don’t know what terms or variables Airbnb is taking into account in a given month. If anything, it’s hard to overstate just how much algorithmic recommendations mediate people’s ability to make money from their content, which is, in a very broad sense, music, photography, art, you know, kind of anything on the Internet.

Phillip: [00:30:42] Whereas maybe art in a prior generation, I would say, memorialized icons of deity or religious experience or spirituality, today it’s very modern to sort of deify these other things that are beyond our control. I’d written a piece in 2021 called The Idolatry of the Algorithm. That was more about the deification of the algorithm and how our language is sort of adapting around this thing that’s outside of our control, and we’re just sort of at its whims. And I keep going back to this 2018 tweet as part of that piece about a Lyft driver saying, “I’ve been blessed by the algorithm today.” I was blessed by the algorithm. I was blessed to have met you because of the algorithm.

Kyle: [00:31:35] {laughter} Yeah.

Phillip: [00:31:35] And it almost kinda brings on this idea of a predetermined path that we’re all being forced down and that we are ultimately relenting to. In the cleanse, you’re sort of opting out of that. So I’m curious if there’s a larger movement of people who then begin to opt out and if they act as sort of a counterculture to some degree. Is that something that we can anticipate?

Kyle: [00:32:02] I think so. I mean, I think our relationship to the algorithm has “grown and maybe peaked.” We saw over the 2010s how we all came to understand, or think we understand, what an algorithm is. And all of a sudden, we ascribe it so much power and authority over our lives, like the driver being blessed by the algorithm, or if you have a lot of successful tweets one day, you’re like, “Oh, the algorithm loves me today. I’m really succeeding.” And so I think that marks the kind of high-water mark of our relationship to the algorithm. If anything is exerting that much influence over our lives that we’re like, “Ah, yes, I’ve been blessed by the algorithm,” it’s like saying you’re feeling lucky today or something. So I think there is a backlash to it, particularly as people’s experiences of algorithmic feeds have become worse in the past few years, just because the feeds have worked in worse ways. Twitter/X is only the worst example. And so I think [00:33:25] we’re seeing another wave of Internet development happening with smaller platforms that are not so algorithmically driven. I think user behavior is changing, albeit slowly. [00:33:38] No one’s giving up the TikTok For You feed necessarily. And that algorithmic feed, the holy algorithmic feed, is still the kind of icon of the Internet experience right now. But, particularly as I’ve had conversations around this book and done book talks and stuff, everyone is expressing their boredom. Everyone is really thirsty for an alternative. But the backlash has not been going on long enough, or maybe just hasn’t had enough resources, to generate actual large enough alternatives yet.

Brian: [00:34:21] Yeah, I think that’s probably true. Yeah. I think that sort of semi-Luddite approach to the Internet is definitely taking hold right now.

Phillip: [00:34:31] {laughter} I’m a semi-Lud.

Brian: [00:34:32] A semi-Lud. Yeah. I am a semi-Lud, actually. We’ve talked about this, Phillip and I, living sort of beneath or above the algorithm and how you let it dictate your life. Like you said, this is sort of starting to take hold as people become aware of it. It is such a black box though. And actually, on that term, you brought up some great thinkers in the book: Marshall McLuhan, Walter Benjamin, and some other incredible media theorists. One who’s not in there, but who you do sort of refer to through the term black box, is Norbert Wiener. He coined that term, and back in the fifties he got into how humans and machines are gonna interact in the future. And I think he does a really good job of sort of predicting that we’re gonna outsource all of our low-level decisions to machines and algorithms, effectively.

Kyle: [00:35:36] Turns out he was right.

Brian: [00:35:38] Yeah. Turns out he was right. Exactly. But the idea that we can’t really know what’s going on inside of the computer, actually, I think is key to us sort of trusting it because we have to. I think that’s…

Phillip: [00:35:52] Do we?

Brian: [00:35:52] We’re part of the magic to it.

Kyle: [00:35:55] Yeah. I mean, one thing that struck me repeatedly while writing the book was just how opaque the systems are. That black box idea is kind of a cliche. We say it about machine learning. We say it about algorithmic recommendations. It might be a black box to us, but there are variables that are chosen by Silicon Valley engineers that prioritize the length of a video or a metric of attention or whatever. And there are certain regulations in the EU that are, I think, now pushing for more transparency and against that idea of a black box, so that we would have more interactivity with the feed or with the recommendations, so that we could opt out of the recommendations or change their frequency or just have this way of talking back to the system. And I think that would be healthier, obviously. But humans are really passive, and out of the huge body of users, maybe you’ll have 5 to 10% of prosumer-level people who will tinker with how their feed works or something. But most people are just gonna go with the default of however something is built to work.

Brian: [00:37:19] Definitely agree. Yeah. Even for those prosumers, or maybe even more importantly, the businesses that are chasing after the algo, it feels like it all changes so quickly. It’s really hard to get your hands around it. Even if it does change and it’s transparent about that change, the ultimate impact of all the variables when they add up gets really hard to grasp. It’s a moving target, and the outcome is really hard to predict when all the variables come together.

Kyle: [00:37:56] Yeah. Yeah. I mean, it makes me think a bit. I write about kind of the 90s/2000s era of the Internet and of computers in the book. And lately, I mean, I’ve been very struck by the shift from software that you download onto your computer that runs in a certain way to everything being cloud-based, everything updating on its own, changing how it works on a daily or weekly basis. And that’s how the Internet works now. We can’t save how Twitter worked five years ago and still run that version of Twitter.

Brian: [00:38:37] Totally.

Kyle: [00:38:37] I wish I could, but we are no longer even in control to that degree of how our software works. We can’t choose to not go for an update or something.

Brian: [00:38:50] Exactly.

Kyle: [00:38:51] And that’s a little, you know, it removes your own agency. It removes your ability to possess those tools that you’re using. And instead, it just gives much more authority to the company that’s running them and the designers who decide how it works on a real time basis.

Brian: [00:39:12] A hundred percent.

Phillip: [00:39:13] And to your point, there was an era of transition where they did allow you to opt in to the new feed for some time. I remember the big algorithmic timeline change on Facebook was sort of an opt-in feature. There was this big fanfare: “Oh, we’re gonna move to a news feed.” Right? And you could opt into it early, and I was trying to stay out of it as long as possible. I was always the last person hanging on to whatever was left. That’s not how the world works anymore. I don’t think that they’re so gracious with the rollout of updates. It also makes me [00:39:53]… I also grew up in AIM-era AOL chat rooms, and those aesthetics are still captured somewhere on the Internet, and they’re memorable because they stuck around long enough to make an impression on us. I don’t know that anyone pines for the brief 2019 interface change on Instagram as it was. There is no era anymore because it’s constantly in motion. And I have to wonder, if every medium is memorable because of its aesthetic, what does it say when the aesthetic just continues to change and shift and evolve? What happens to our memories of those things, especially as to how we relate to each other? [00:40:39] And it’s unanswerable, but that’s kind of our shtick here.

Kyle: [00:40:45] It’s a bleak question. No. It’s like, how do we remember anything if it’s all changing so quickly and it’s also ungraspable? I think a lot about the kind of generation of Internet artists, or “Internet artists,” who emerged in the late 2000s and early 2010s. A lot of it was in Brooklyn, where I was living at the time. And these were artists who were doing their projects on Tumblr or on YouTube. And now fast forward 10 years, and those platforms do not look or work the way that they used to. The context has been removed from these artworks. And so museums, which are all about preserving context and showing a work of art within its original format, have gone back and emulated 2009 YouTube or whatever. They have to restore that interface. They have to bring back the old version of that thing, or they have to emulate the Tumblr scheme from 2010. And that’s really interesting to me because that creates the meaning of the artwork. You can’t really experience that artwork unless you are experiencing it in that old Internet format because that’s how it originally worked. It’s an old piece of video art that’s meant to be shown on a cathode ray…

Brian: [00:42:23] Right.

Kyle: [00:42:24] Tube TV or something. It’s just part of the inherent quality of that thing, but we lose that so much with the rest of our experiences online that aren’t necessarily artwork, but are still cultural. We experienced a particular song on YouTube via a recommendation maybe in 2017, and there’s no way of recapturing that experience. It’s not like you can put the final record back on the player and experience it again. It’s just completely gone.

Phillip: [00:42:55] I’d love to kinda get into this idea because I think we’re on that track. Section 230 is a relevant conversation right now. You know, there are a lot of policymakers who seem pretty hell-bent on changing the nature of how platforms moderate content. Very prescient for this book to exist at this time, especially with you having a whole section dedicated to the role that platforms play in moderation, content moderation, and how they shape it. But don’t algorithms already do that? Or is there a human quality to it that isn’t really curation, it’s moderation? But isn’t that already something that we’re involved in, and how would a change to Section 230 change what we’re talking about right now?

Kyle: [00:43:48] Yeah. I mean, Section 230, it’s like the Bible. Everyone interprets it in their own way and has their own spin on it. In terms of content moderation, moderation is something that most platforms do already. Facebook tries to be a decent content moderator. We can argue to what extent. YouTube tries to keep terrible things off of its platform. But reforming Section 230 would just make it a mandate rather than something to be polite about. If Facebook is legally liable for things that exist on its platform, then it will prevent things like misinformation or libel or revenge porn or whatever from being on its platform. Because if it is on the platform, they will get sued and their company will be destroyed. The forcing of that responsibility through the law, I think can be a good thing. It’s tough. You have to strike a balance because the utility of social networks as we know them is that anyone can post anything more or less, and reach the audience of whoever wants to see that thing. So in a way, that’s why we think of social media as a neutral format, and that’s why it’s protected in Section 230. But I think TikTok and Instagram and Facebook have increasingly taken on the qualities of something more like cable television, where they do really choose what to prioritize and what to promote. They’re making a lot of editorial decisions about what’s on their platform. And so maybe they should be responsible for it. And if mandating that they’re responsible for it removes 40% of content from the Internet, then maybe that’s fine. You can still email whoever. Email is an actually neutral format. I haven’t researched this, but email is just like an open system of communication, whereas Facebook is exerting a lot more influence over what flows through it and how. So I mean, personally, I’m a fan of more legal responsibility for platforms for their content.

Phillip: [00:46:27] Yeah. I gathered that from the book. I think the question being, when platforms that have been places where culture happens, like Tumblr, have to become commercially viable to exist in the future, then they become censored to a degree where the content within them is no longer deemed useful or valuable to the commercial aspirations of such a platform, and then it’s gone forever. And that, I think, is how commerce to some degree already acts as its own content moderation lever, because after a certain point in time, that thing can no longer exist in its current form without becoming more commercially viable. And so the anti-230 argument in me would be that capitalism kinda just solves it eventually, and these darker things maybe go away. And maybe other stuff is an innocent bystander. You know, it’s an unfortunate reality where that’s gonna go. Things we like have to go along with things we don’t.

Kyle: [00:47:31] It’s funny to think… I do think capitalism can solve it. It should be a market-produced solution, but the market in a technology sense has been so skewed in a million different ways that competition is not happening. I mean, when you think about the flood of VC money into social platforms in the 2010s, Tumblr never had a market solution because it was subsidized by a bunch of venture capital and then by its purchase by, what, Yahoo? For a $1,000,000,000 at whatever point. And then the curtain fell and everyone was like, “Oh, wait. This shouldn’t even exist. This does not sustain itself. It has no reason to exist on its own.” And so I think the floods of VC money subsidize certain products and behaviors that are not just in the interest of a market or of users. It’s a market dictated by investors rather than P&L or whatever. And the technology landscape has become so monopolized by Meta and Google that even new and interesting solutions that might slowly develop and grow into different experiences are acquired and stamped out as fast as possible. I mean, the fact that Meta owns WhatsApp and Instagram and Facebook, what competition, what capitalist competition can emerge from elsewhere? It’s a challenge that seems nigh impossible.

Brian: [00:49:22] Features get copied immediately, and if they’re already available on the platform where you’ve already done well, why would you ever migrate to a new platform? Of course, there’s opportunity for net new creators to come up on new platforms, and we can debate markets versus regulation, I think, all day long. And I think there are probably good cases on both sides, but I do wanna come back to something and take you in a little bit of a different direction for just a second here, because you said maybe 40% of content could go away and that might be fine. I do think there’s something huge here…

Phillip: [00:50:04] 40% of Future Commerce’s content could definitely go away, and no one would notice.

Brian: [00:50:12] Matt Klein over at Zine does his meta trends report every year, where he reports on the trends of the year, aggregates them, and provides commentary on them. And one of the things he noticed is that all of the trends of the last five years kind of fall in the same bucket, and you can’t even determine which year a given trend prediction came from. I think this is sort of consistent with the flattening of all culture and experiences. One of the things that I was thinking about is that algorithms are always gonna take us back to what’s most engaged with. It’s gonna go back to the mean. And we have so much content now that it would take one person’s lifetime and many, many more to get through everything that’s already out there. And I’m not talking about just all the bad stuff. I’m talking about just the good stuff. The stuff that’s actually worth engaging with is worth many lifetimes’ worth of engagement and study. And so what I’m kind of driving at here is, are we now at a point where creativity and new aesthetics are being hampered not just because the algo is feeding us the stuff that we engage with, but because there’s so much content out there already that the algo never has to touch anything new? There’s already something out there that’s good for us to engage with, and we don’t need anything else. We’ve already hit peak content.

Kyle: [00:51:57] Yeah. The Library of Alexandria of everything already exists.

Brian: [00:52:02] Yeah. Already exists.

Kyle: [00:52:02] And so we can dredge up the 100 other people that have already answered this question. I think that’s a very interesting hypothesis. We’ve seen it with Spotify, where a bunch of studies have found that Spotify is recommending older music more than newer music. There’s more listening to older music than there is to newer music, which is such a strange phenomenon because it deprioritizes the production of new music.

Brian: [00:52:34] Right.

Kyle: [00:52:35] I mean, the whole thing makes me think of how you kind of have to forget everything that exists out there [00:52:41] already. You kind of have to ignore that someone else has already thought about the problem that you’ve thought about or come up with a good book on whatever. You have to have this willful amnesia to make something new. [00:52:55] Because otherwise, you’re just gonna get dragged down into the fear that someone has already done it better or that you will never add anything else. You have to just do it anyway. That’s what every artist or writer or poet has to contend with.

Brian: [00:53:09] Totally. No. I’m feeling that right now, actually. Actively.

Phillip: [00:53:14] He literally is going through this. He’s having his own crisis about it. It’s a cancer in my mind. Okay. That’s a charged term. Let’s strike that one. Sorry. I think it’s a big mistake, and a very modern one, to believe that the thing that you’re creating has to be for other people’s consumption. Although I know that that’s the premise of us having a media business. But at least Rick Rubin, the one thing he’s ever said that I kind of agreed with is that you kinda have to do it for yourself. And it’s the process of you getting it out of you that is the important piece. And then maybe one day someone else cares about it, but that’s not very commercially viable…

Brian: [00:53:59] But media creation is all sell out now. It’s 100% sell out. Everyone that creates something is doing it to make money.

Phillip: [00:54:06] Everything comes to an end, Brian. Everything. There’s nothing that’s permanent. I think that that one day will end too.

Kyle: [00:54:12] They might. It’s a certain era that we are living through and have lived through. The metric of success is the attention that you get on the Internet, and that’s why every artist is also an influencer. Your followers and your audience numbers are what dictate your success. And I think we have lost that sense of just making something for yourself and creating something that exists because you want it to, not because it will get 1,000 people to hit the like button. I think that idea of art without an audience, or creativity without an audience, has been lost in the total visibility of the Internet, and that’s a shame. On one level, it’s sad when you make art and no one cares about it or ever sees it. On another level, what other reason is there? You don’t make art so that other people like you. At least I don’t think that’s why we do it. Or good art, at least, is not just driven to get engagement and likes. It’s driven to say something, to express some ineffable personal quality of your life.

Brian: [00:55:25] Or is it? We look back at how art’s been created over history, and a lot of it has to do with monetary situations. And the best art that’s been sustained throughout history often has some sort of commission component to it or whatever.

Phillip: [00:55:44] Patronage. You’re talking to Kyle. I think Kyle has a pretty firm grasp on…

Brian: [00:55:49] We’ve democratized patronage.

Kyle: [00:55:53] No. That’s… I mean, democratizing patronage is a good thing, I think. I mean, now you can create whatever for your audience of a thousand people if you’ve found that audience, and that’s very nice. But, I mean, for me, I am a modernist fundamentally. And the modernist conception of art is that you are expressing something ineffably specific to your worldview or whatever. And the financialization of that is a different quality. But, I mean, you could think about it in music too. Why do you make music? You are making it to reach people, to communicate a feeling, to build a community, whatever. The idea of making music solely to get more streams on Spotify is what results in lofi chill hip hop beats music. If all you wanna do is have as many people as possible ambiently paying attention to your YouTube stream, then the result is art that means nothing.

Brian: [00:57:07] Totally. Yeah. I know. And that we’re very much aligned on.

Phillip: [00:57:10] Yeah. Well, which I think comes back down to what we’ve done with consumer brands. We’ve seen this over the years with what seem like almost algorithmically generated consumer brands because of the distribution method, of where you find those brands. You’re not going to a specialty store. You’re not going to a curio shop, or even the bodega, I think. There’s a friend of the pod, Emily Sundberg, who wrote the sort of Shoppy Shop aesthetic piece. Everything has really become homogeneous even without, I would say, the algorithm. I think maybe there is a consumer dynamic around chasing what’s popular and trying to figure out what’s popular because it’s deemed to be some sort of commercial success, even though, I think, we figured out that the business models weren’t necessarily commercially successful, except maybe under a zero interest rate policy aberration. But, yeah, this mimetic behavior of copying what other people are doing is actually fundamentally human. We just now have a global scale in a way that we never had before. Is that the algorithm, or is that just human behavior being accelerated?

Kyle: [00:58:20] Yeah. Something I ended up with in the book is that it’s not just algorithms per se. It’s not just the existence of recommendations. It’s the existence of these globalized digital platforms, period, which are the things that connect so many people and allow them to do that mimetic copying of behavior and decision making. We’ve never had a real time exchange of digital media between a billion people around the world, but now we do. And solely the existence of that, I think, is something that creates homogeneity in the same way that, I quote this French philosopher from the 19th century in the book, and he is complaining that trains have ruined Europe because people are traveling too quickly between European cities and so the cities are getting more homogeneous. It’s like [00:59:14] the sheer ability of people to move quickly and exchange ideas and information is going to create that homogeneity. It’s just that algorithmic recommendations and feeds make the speed of that exchange even faster, even more granular. [00:59:29]

Brian: [00:59:30] Yeah. Marshall McLuhan says all information watches over us continuously because of electronic communication. And that gets back to what we were talking about just a minute ago, about how all content is available at all times. There are billions of people creating, and all of that’s available all over the place at once. And so how then are we supposed to… I think it’s discouraging net new creation. I think that’s ultimately where we’re headed here: the net new is going to continue to be a challenge to highlight, promote, and share. And I think you kind of touched on this at the end. The only way that we’re gonna be able to do this effectively in the future is through tastemakers, curators, people that have the ability to recognize something as good and bring it to other people.

Kyle: [01:00:31] But we can all be tastemakers.

Brian: [01:00:34] Yes. We’re all tastemakers now. Yeah.

Phillip: [01:00:36] Thank you, Kyle.

Kyle: [01:00:39] To me, there are so many negative connotations and perceptions of elitism around the idea of taste, the idea of gatekeeping, the idea of tastemakers. But I think we have to get rid of that and just remember you can find a piece of music and recommend it to your friends, and that means you’re a tastemaker. You are spreading your enjoyment of something to other people.

Brian: [01:01:07] Bring back the canon. I’m in. Let’s go.

Kyle: [01:01:08] Yeah. I mean, we should all participate in it. And to me, it’s like that tyranny of visibility is the bigger part of it, where not everything has to go to everyone. A band can be popular in one city without being popular in all cities. You can like a restaurant on your block because it’s on your block and because it’s specific to itself without it needing to become a meme on Instagram. So I think, taking away that elitism of tastemaking and instead just being like, “You should appreciate what’s around you and participate in the cultural ecosystem that’s around you.”

Brian: [01:01:51] This is super funny. My son just wrote an essay. He’s 12. About this very thing.

Phillip: [01:01:58] Of course, your son wrote that essay. That makes so much sense. Can I actually… I’d like to call out the horseshoe nature of the things we’re talking about, because I think it’s ideologically aligned with anti-globalism on one small scale. To say preserving culture is a slippery slope of an idea that eventually aligns very, very well with the idea behind anti-immigration policies, where it’s like we have to preserve our culture. And maybe in that way, I know a lot of people have made fun of the network state and Balaji Srinivasan and some of these ideas, the way that we sort of ideologically align and that there’s something more meta or more abstract than nation states, that there are thought-based nation states that have formed in the age of the Internet. I feel like there’s a horseshoe effect here, and there is a point of agreement to say that some things are worth preserving, and in this case, maybe personal tastemaking is the solution, and not necessarily saying that we don’t need global networks. But I’d love your reaction to that, to say that maybe there’s something that we all have in common here, which is that we feel like a sense of loss is occurring, or maybe it’s not a loss at all. What are your thoughts on that, Kyle?

Kyle: [01:03:24] Yeah. I always like this Rem Koolhaas quote. He’s a Dutch architect and a very influential postmodern theorist, basically. And he has this line in his essay, The Generic City, where he says identity is a trap. And I just always think about that. The full line is something like identity is like a mouse trap where the bait gets less and less each time or something. It’s an amazing line. But I think preservation is the wrong word. It’s not about preserving a local identity. It’s not about preserving some kind of traditional notion or something, but instead, [01:04:14] it’s about connecting with what’s around you, connecting with people who are in line with your philosophy or whatever. We can build communities without everything having to be for everyone, maybe. [01:04:32] So in that way, I like the idea of network states. I like the idea of magazines as communities. I think this is a kind of old notion that we can connect with other people who think like us and build communities and build in that way. But it is a really hard question of how we move toward this better future, a better conception of culture. And what comes to my mind is that one thing that has led to the state that we’re in is the cafes in Brooklyn are paying more attention to the cafes in Australia than they are to what’s happening in New Jersey. There’s a temptation to just look at the broadest possible scale at all times on the Internet because that’s doable. And that’s super interesting because you can reach anyone across the world and connect with anyone. But I think we are seeing a move back toward, okay, what’s around me? Who is doing stuff in my neighborhood? What are the kinds of specific cultures that are around me that I can engage with? And what makes this place different from the Australian cafe, not more like it?

Brian: [01:06:00] Right. Yeah.

Phillip: [01:06:01] More pictures of Bruce Springsteen, that’s what makes them…

Kyle: [01:06:05] {laughter} I mean, it is the specificity. The generic minimalist Australian cafe is like an amazing solution to a specific problem.

Phillip: [01:06:17] They’re truly their greatest cultural export besides Outback Steakhouse.

Kyle: [01:06:21] It really is amazing. Avocado toast is the perfect globalized dish of the 2010s. Yeah. And specificity, I don’t know. I like that word specificity, but it’s hard to achieve specificity. Achieving and then maintaining specificity seems very, very difficult.

Brian: [01:06:44] It feels like everything we’ve talked about sort of flies in the face of specificity, because we’re losing accents all over the world, where people are actively shedding them. And when you talk about specificity, actually, language is a great indicator. The way that people speak languages is a great indicator of how specific or general things are. And so, back to your train example, cities are losing their character. Nation states are losing their character now because of the Internet. And there’s the digital front porch, finding people who are intellectually or aesthetically aligned with things that you enjoy, versus the physical front porch of saying hi to people that are on your street. I’m not sure which one results in more diversity of thought, because it’s hard if you err too far on one side or the other. It feels like you’re just going down a rabbit hole.

Phillip: [01:08:00] Well, cat’s out of the bag. Right?

Kyle: [01:08:02] You need to do both of those. Digital front porch… Is that an established phrase? That’s very good.

Brian: [01:08:10] No. I just made that up.

Kyle: [01:08:11] You should run with that one. That one’s very good. But this is what makes me optimistic about a Substack newsletter ecosystem. I mean, the Substack ecosystem is the opposite of the BuzzFeed era of all content for all people. It’s like, “No. I have my digital front porch, which is my newsletter, and I see the people who pass by, who engage with my thing, and they like it and we wave to each other,” and whatever. And through that, you’re slowly building a conversation that’s different from the rubber bands on the watermelon video. {laughter} If the end point of the BuzzFeed era was the blue dress, the blue and white dress, or the watermelon exploding because of rubber bands, okay, we’ve seen that. That’s all content for all people. And now we have this much more decentralized, smaller scale exchange of writing or videos or audio or whatever that’s happening through all of these smaller points of entry. And that makes me optimistic, but it is much more opaque and inaccessible. So if I want to get into a new thing, it’s harder to make headway into a particular topic or community or whatever, because you have to work for it.

Brian: [01:09:36] Just to add one last thought to that. When it comes to things that have actually impacted me for real, recommendations the algo has made versus recommendations from people that I trust or care about or that I think have good taste, almost always the recommendations from people that I trust or respect are the things that have stuck with me for longer periods of time, unless it’s a silly video or something that’s really catchy or approachable, that’s just really generic, and generic to humans. Everybody loves looking at silly cat videos or silly dog videos. Those make me laugh. I find that those types of things are the only thing that sticks with me from the algo, and pretty much everything else has faded away. All my music tastes and all my best movie tastes and all of that haven’t come from Netflix recommendations. They’ve come from people that have said, “You need to watch this.”

Kyle: [01:10:37] Yeah.

Phillip: [01:10:38] But, Brian, you’re like a king of…

Brian: [01:10:40] A semi-Lud. Yes.

Phillip: [01:10:42] You’re the guy, though. This is the one thing that I feel like we’ll think about more in depth when you’re not having to listen to us opine about it, Kyle. But I’m kind of… You have a very specific personal taste, Brian. And it’s not really negotiable for you. I feel like mine changes over time. I feel like I’m always kinda reinventing, but you have been the Costco-loving, Pendleton-sweater-wearing, pipe-smoking guy as long as I’ve ever known you. And the issue is, what happens when the culture comes for the things you love and now you’re the meme?

Brian: [01:11:18] I’ve always been the meme. The memes just change. For a while, people made fun of it, and then people thought it was cool. It’s gonna keep cycling.

Phillip: [01:11:25] I think the algorithm deciding that Costco is important or popular or what people chase now kind of undermines your uniqueness. And that’s where it becomes this weird thing where, no matter what you do, you have to be aware of it. Kyle, we had a couple questions from the audience. I think the resonant theme here was… I mean, you’ve really written about culture being impacted, but our take is that commerce is a unique expression of culture in the same way that food is a unique expression of culture. Do you see ways that commerce is being impacted by Filterworld?

Kyle: [01:12:02] Yeah. I mean, so much of what we buy seems to be influenced by algorithmic recommendations too. I mean, I’m very guilty of that, whether it’s trying to find dog treats on Amazon or going on bookshop.org or whatever to buy books. I feel like all of our purchase decisions are influenced by these things. And commerce was kind of an early area where these technologies developed. The Amazon homepage has been algorithmic and has had recommendations for decades at this point. So I think it has flattened those experiences in the same way, and it’s led to phenomena like the one checkered rug on Amazon that everyone buys for one summer and then it’s over. So I think it has sped up trend cycles. Our digital ecosystem has sped up trend cycles and made things more universal more quickly. So suddenly everyone has that rug. Everyone has that jacket.

Phillip: [01:13:15] And that’s where the Amazon effect comes in: every consumer now demands the same type of experience everywhere they go, from shipping to the means of navigation. We’ve talked about the Shopify effect, which is that every single website looks and behaves functionally identically to the next. That’s its own class of homogeneity, something we’ve railed on since 2016: we’re all kind of tending towards this average, and it’s not hard to stand out if you put any effort into it whatsoever. But very few are willing to at this point because it creates friction. And we don’t want friction. We wanna grow our businesses. So it’s very much akin to the Filter World thesis here.

Kyle: [01:14:02] I feel like there is this question of how do you have the great boutique store online. And how do you build an experience that’s meaningful and a curation that’s meaningful? And I feel like we had that at one point with Tumblr, or various Tumblr-era things, but now we’ve kind of left it behind once more for the most frictionless solutions possible. Though I see things like… who’s the founder of the Wing? She now has, like, a twee eCommerce shop. Not Ten Bells, something like that.

Phillip: [01:14:41] Hold on. I forget. She did. There was an announcement about this, but I’ve not really paid attention to it.

Kyle: [01:14:49] Oh, yeah. She has, like, a twee home goods storefront in Brooklyn that’s also a huge eCommerce operation. And that’s scaled-up curation in a way. It’s not just the storefront in Brooklyn. You can participate in the brand and the aesthetic curation from wherever you are.

Brian: [01:15:11] And I think that gets back to the personal touch on things, like the influencer strategy. Is this the new influencer? The retail store curator, content curator, polymath curator who has something in Brooklyn and has an online store, but also has a really cool media feed where they put out recommendations. And there are people like Daisy from DIRT. I feel like she’s incredible at being that kind of curator. I’ve seen some of her stuff, her visions for what DIRT can be, and it actually sounds not that far off from what we’re talking about.

Kyle: [01:15:54] No. I think things are trying to go that direction. And, I mean, it’s interesting. I think for certain influencers, the fact that there is this kind of pressure to do everything or ability to do everything can be good for them. I’m thinking of this illustrator/fashion influencer, Jenny Walton, who writes a lovely Substack with thoughts on vintage fashion. And she does illustration commissions for fashion brands. And her Instagram is really delightful. She is someone whose personal taste is so strong and her ability to broadcast it is so strong that I’m like, “Do everything. Tell me what toothbrush to buy, and I’ll buy it.” I will go to your boutique. I will shop your life.

Brian: [01:16:49] And one of the things we’ve talked about at Future Commerce in the past as well is how algorithms can be used to extend those people’s viewpoints. So perhaps that’s part of the future of algorithms. It’s not some massive amalgamation of all of our tastes, but specific people’s tastes and how those might interact with things that you’ve expressed you’ve liked in the past. And Phillip’s talked about this idea of renting algorithms, or bring your own algorithm. These are opportunities, I think, for algorithms to take the next step in actually assisting us, because they did for a while, but it’s all just coming back to the mean. Maybe there’s hope for the algorithm to actually be useful again in the future.

Kyle: [01:17:36] Yeah. I mean, that’s a much more interesting idea. I think the danger of mashing all of our tastes together has become really apparent. Collating the data of a billion users to decide what I am going to look at next has really not turned out super well, in my opinion, and it’s particularly not good when we’re applying that to commerce or fashion or music or art. I don’t think that strategy has been successful.

Phillip: [01:18:08] Well, I implore people to make their own decision based on your book. I think reading your book is so enlightening. I think it gave me 15 other pieces of art and literature to have to consume. So in some way, it’s its own recommendation.

Kyle: [01:18:26] Yeah.

Phillip: [01:18:27] Brian, one mention of Sufjan and Brian was like, “I’m in.” I’m like, “I feel vindicated that Kyle spent an inordinate amount of time on ‘Ants Marching,’” because I was also a Dave Matthews fan, an aficionado, at some point. But I can’t recommend the book enough. If anything, just having one piece of writing that accurately describes how we got to where we are right now with this particular moment in culture is so important to me. Can’t wait to give it a second read, but I know that you’ll enjoy it on your first. So Future Commerce listeners, go check it out. It’s called Filter World, by Kyle Chayka. And I’m assuming, do you prefer that they not buy it through an Amazon algorithm, Kyle? {laughter} What’s the best place?

Kyle: [01:19:13] Well, we all have to work through all the algorithms. I do not care where people buy it. I do think they should rate it five stars on Amazon and leave many positive reviews even if they have not read it. Just go on there and game the system for me. You know, it’s all good.

Phillip: [01:19:33] I love that. We can change it from the inside. That’s what we can do. I love that. Well, thank you so much, Kyle. Thanks for coming on, and thank you all for listening to Future Commerce. You can find more episodes of this podcast and other Future Commerce properties at FutureCommerce.com. And you can get an ad free version of this episode and all of our podcasts, by joining Future Commerce Plus. One low monthly price gets you a discount on prints and merch and exclusive invites to our next events. You can get all of that in one place, FutureCommerce.com/Plus. Hey, commerce is culture, and we appreciate you spending some time with us. Thank you for listening.