Hey there! It's good to be back on the blog.
Over the past few months, I've been focused on setting up the foundations for A New Social. I couldn't have imagined that this is where I'd end up after writing my Bridges & The Last Network Effect post, but here we are! It's been a wild ride, but I'm energized about the work we've set out to do, and we're ready to share a lot of exciting things over the next few months.
That said, now that the organization is established and we're in a proper release cadence, I'm looking forward to carving out more time for this space. I have essays sitting in drafts that are nearly done, and I want to get back to sharing what I'm reading, watching, and listening to regularly.
Since we're back after a while, let's start with a vibe check.

Vibe Check
One thing that's been preoccupying me lately is the growing trend of "vibe coding". For the uninitiated, vibe coding is a method of writing software primarily through prompting large language model (LLM) chatbots.
This is a helpful way to quickly build tools for yourself or prototype an idea to see if it's worth developing further. I've dabbled with it in the past, but I've slowly drifted away from it, because extending anything beyond the initial concept usually requires heavy re-engineering. It's also frequently plagued by bugs and security issues, and if I have to rewrite major components anyway, I'd rather do it myself from the get-go.
It's a helpful tool for curious, hands-on learners who want to jump-start a project with vibes and then dig deeper to understand the mechanics of how it's built. I also think it's great if you want to quickly script a tedious task to save yourself a few hours. You know, stuff for yourself.
Where I begin to get nervous is when it's used for production code that serves real users, especially when sensitive data is involved.
Semafor's Reed Albergotti reported that apps developed via the vibe coding tool Lovable were found to have, er, significant security issues:
The employee at AI coding assistant company Replit who wrote the report, reviewed by Semafor, says he and a colleague scanned 1,645 Lovable-created web apps that were featured on the company’s site. Of those, 170 allowed anyone to access information about the site’s users, including names, email addresses, financial information and secret API keys for AI services that would allow would-be hackers to run up charges billed to Lovable’s customers.
Yeesh. And these were just the ones featured on the company's site, something you would expect to be a bit more curated.
I want it to be easier for more folks to write code and understand how the software they use every day works. But building software isn't just writing code. It's about learning good architecture: how data is accessed, how it flows, and, most importantly, who can access it and when. You can't just vibe your way through personal information and payment systems, because an issue there isn't just a bug; it has real-life consequences for everyday users who are entrusting you with their information.
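To make that concrete, here's a tiny, hypothetical sketch in TypeScript. None of it is code from the report; the route, env var, and provider URL are all made up. It just shows the difference between the kind of leak described in the quote above (a secret key shipped to the browser) and the boring-but-correct alternative: the key stays on the server, and the server checks who's asking before doing anything.

```typescript
import express from "express";

const app = express();
app.use(express.json());

// ⚠️ The vibe-coded anti-pattern: a secret key bundled into client-side JS,
// where anyone can read it from dev tools and run up charges on your account.
// const MODEL_API_KEY = "sk-..."; // ← never ship this in browser code

// ✅ Safer: the key lives server-side, and the route gates access.
app.post("/api/generate", async (req, res) => {
  // Placeholder auth check — swap in your real session/token validation.
  const user = req.header("authorization");
  if (!user) return res.status(401).json({ error: "sign in first" });

  // The secret comes from an environment variable, never from shipped JS.
  const key = process.env.MODEL_API_KEY;
  if (!key) return res.status(500).json({ error: "server misconfigured" });

  // Forward the prompt to the model provider from the server side.
  const upstream = await fetch("https://api.example.com/v1/generate", {
    method: "POST",
    headers: {
      authorization: `Bearer ${key}`,
      "content-type": "application/json",
    },
    body: JSON.stringify({ prompt: req.body.prompt }),
  });
  res.status(upstream.status).json(await upstream.json());
});

app.listen(3000);
```

Nothing here is exotic. It's just the kind of question, who holds the key, who's allowed to call this, that vibing straight to production skips right over.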
The tech industry attempted "Move Fast, Break Things," and it led us down some dark paths, to say the least. Vibe coding instills the idea that we need to move even faster, because shipping means success. But when things break and you don't understand the software you shipped, things get ugly real fast.
If this trend continues at its current pace, especially with the potential use of vibe coding by governments, we may face a significant national security risk over the next decade.
We don't need more people writing software; we need more people who understand it. The vibes are insecure.
What I'm Into
- Mandy Brown's analysis on Artificial "Intelligence" as an ideological construct rather than a technology, and the power structures it enables and entrenches, both in name and function.
- Friend of the blog Erin Kissane and a team of brilliant folks, including Mandy Brown, are building Unbreaking: an online resource to "help orient and ground our communities in clear and rigorously cited explanations of what’s happening to our government and why it matters."
- Indie gaming studio TwigBit's "Sky Museum", a game where you sign in with your Bluesky account and roam a 3D gallery of art being shared in different feeds on the platform. What a unique way to experience the open social web!
- After Vox Media sold gaming news site Polygon to content slop shop Valnet, some of its former guide writers are launching a new publication called BigFriendly.Guide.
- Nilay Patel of The Verge interviews Google CEO Sundar Pichai on the Decoder podcast, where they discuss a very AI-focused Google I/O.
- Anil Dash explains how we inhabit an "Internet of Creeps", where our data gets pulled, stored, and accessed in ways the average user could never expect by default. What we need, he says, is an Internet of Consent. Hell yeah.
- Richard MacManus's wonderful piece about The Three Gurus of 90s Web Design.
- And I'm currently reading Patrick McGee's "Apple in China", a detailed history of the deep-rooted relationship between, you guessed it, Apple and China. I've only read through the first part of the book as I write this, but I already have so many thoughts. Perhaps for another post when I'm done.
That's all for this issue! Now, I'm excited to get back to my giant backlog of drafts so I can finally share them with you.
Oh, and while I have you: A New Social just launched a Patreon and a merch shop that includes various products with our "People, Not Platforms" motto. If you like the work we're doing, we'd really appreciate your support!
Until next time 👋🏼
Thank you for reading! You can follow me on the social web on Bluesky, Mastodon, and Threads. And if you want to be notified of future issues of augment and my newsletter "Human-Generated Content," you can follow on RSS or subscribe here for free!