In October, I visited San Francisco. The city was running so efficiently it had devolved into an Aldous Huxley novel. I ended up stuck in a Waymo autonomous vehicle for 30 minutes because it couldn’t tell the difference between a highway and a middle-school pickup line. When I contacted customer support to explain that I had been kidnapped by their “car of the future,” they offered me a discount code for another Waymo, which also malfunctioned.
It was a minor yet revealing inconvenience. This is peak San Francisco, which means it is also a look into the future of automated vehicles and problem-solving. My experience wasn’t just a one-off. It was a glimpse into how major tech companies are approaching innovation through our generation. They create problems that only their products can solve, turning the youth into both a testing ground and a captive audience. Today’s youth are expected to be both innovators and the ethics police, fixing the problems of a system we didn’t create.
But sitting in that Waymo, watching it mistake a Trader Joe’s parking lot for Interstate 280, I realized this wasn’t just about broken robot cars. This was a perfect metaphor for what the elite are doing to my entire generation. They encourage young creators to make groundbreaking technology, then hand the creators and users — all parties being youth who want to experience our evolving world — the instruction manual for fixing those problems, usually written in incomprehensible legal language, and act shocked when we’re frustrated about it.
The sudden acceleration of artificial intelligence into the mainstream has pushed high school students into a moral landscape prior generations never had to navigate. We live at a unique intersection of privacy, morality, and innovation. In our current social climate, we are constantly aware of the equal and opposite reaction of every decision we make.
But the burden doesn’t just fall on young creators; it falls on young users too. I sit among friends in class, scrolling through other people’s social media, casting judgmental gazes at how other teens use platforms that never asked for their consent: whether the bikini they bought was made sustainably, whether the company acknowledged the workers who created what they use. We have become moral auditors. We carry this weight because we live in a world where awareness is constant. We have become ethical know-it-alls, yet who made that our job?
This expectation isn’t accidental. Scattered throughout SF are multi-million-dollar AI startups racing to be the first to pair wearable technology with discreet privacy. These LinkedIn-celebrity hubs of techfluencers, Ivy League dropouts, and young prodigies work day and night on Top Ramen and Celsius energy drinks, twenty-somethings stacked on top of each other in bunk beds. They churn away, hoping to be rewarded with accolades, yet are met instead with the weight of everyone else’s expectations.
Major tech companies excel at creating problems and asking young people to solve them.
I encountered this reality firsthand while interning at Mira, where I got a glimpse into this world and saw an uncomfortable truth laid bare: tech companies are creating profound ethical dilemmas, profiting from them, and then shifting both the responsibility for solving those problems and the blame for causing them onto two groups of young people: the young creators building the technology and the young users consuming it.
Around this time last year, Mira, not yet an established tech company, was a hot topic among the general public. The controversy was expected, and as one of the first to lay eyes on this new generation of wearable technology, I predicted the media feedback would be less than satisfactory, because all innovative tech operates on a build-first, ask-later motto. Of course, this raises the question: how much risk is an ethical amount of risk? The answer: none, when that risk comes at the cost of privacy and safety.
As the initial pushback from privacy advocates mounted, the true concerns became apparent. The public debate wasn’t about transparency; it was about accountability and how to move forward in the new age of technology.
The tech industry, as it currently stands, is not ethically neutral. It prioritizes speed and efficiency over safeguards. Until institutional powers reclaim responsibility, young people will continue to inherit problems labeled as opportunities.