Worth Knowing: Spotify's Mind-Reading & Uber's Magic Cart
Spotify finally lets you type what you actually want to hear, and Uber Eats learns to read your handwriting.

Worth Knowing: February 11, 2026
Forget the trillion-dollar infrastructure bets for a second. Today's news is about the stuff you actually use while waiting for your coffee.
Spotify Finally Gets It
TL;DR: You can now bully Spotify into playing exactly what you want by typing oddly specific prompts.
We've all been there: searching for "focus music" and getting a playlist that sounds like a spa having a panic attack. Spotify's new "AI Playlists" feature fixes that.
Instead of guessing genres, you can now type things like:
- "Songs for when I'm pretending to work but actually shopping for lamps."
- "Mid-2000s indie pop that makes me feel like the main character in a movie about rain."
- "Aggressive techno to drown out my open-plan office."
The AI analyzes the vibe, not just the metadata. It's rolling out now to Premium users in the US and Canada.
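Under the hood, "matching the vibe" boils down to scoring tracks against the words in your prompt rather than against a genre label. Here's a deliberately tiny sketch of that idea (this is an illustration, not Spotify's actual system; the track names and tags are made up):

```python
# Toy illustration of prompt-based playlist matching: score each track
# by how many of its "vibe" tags appear in the free-text prompt,
# instead of matching a genre name. (Hypothetical data, not Spotify's.)

TRACKS = [
    {"title": "Rainy Window", "tags": {"indie", "pop", "rain", "melancholy"}},
    {"title": "Lamp Shopping Groove", "tags": {"funk", "upbeat", "background"}},
    {"title": "Warehouse Pressure", "tags": {"techno", "aggressive", "loud"}},
]

def match_playlist(prompt, tracks, top_n=2):
    words = set(prompt.lower().replace(",", " ").split())
    scored = [(len(t["tags"] & words), t["title"]) for t in tracks]
    scored.sort(reverse=True)
    return [title for score, title in scored if score > 0][:top_n]

# match_playlist("aggressive techno to drown out my open-plan office", TRACKS)
# → ["Warehouse Pressure"]
```

The real system presumably uses learned embeddings rather than literal word overlap, but the shape of the problem is the same: free text in, ranked tracks out.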
How to Use It
- Hit "Your Library".
- Tap the "+" icon.
- Select "AI Playlist" and get weird with it.
What This Means for You
The algorithms are finally listening to you, not just your listening history. It turns the app from a jukebox into a personal DJ that actually understands context.
Source: Spotify Launches AI Playlists by Spotify Newsroom
Uber Eats vs. Your Scribbles
TL;DR: Uber Eats can now read your chaotic handwritten grocery list and fill your cart for you.
If you're still manually typing "milk, eggs, bread" into a search bar like it's 2015, stop. Uber's new "Cart Assistant" uses computer vision to read your handwriting (or a photo of a recipe) and instantly find the items.
You snap a pic. It builds the cart. You pay.
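The interesting part isn't the photo, it's the matching: scrawled, misspelled items have to land on real catalog products. Here's a toy sketch of that step using fuzzy string matching, assuming an OCR pass has already turned the photo into raw text (this is an illustration of the general technique, not Uber's pipeline, and the catalog is invented):

```python
# Toy sketch of the list-to-cart step: fuzzy-match each scrawled line
# against a store catalog. (Illustrative only; not Uber's actual code.)
import difflib

CATALOG = ["whole milk", "large eggs", "sourdough bread", "butter"]

def build_cart(ocr_text, catalog=CATALOG):
    cart = []
    for line in ocr_text.splitlines():
        item = line.strip().lower()
        if not item:
            continue
        # Closest catalog entry, tolerating typos ("mlik" -> "whole milk").
        match = difflib.get_close_matches(item, catalog, n=1, cutoff=0.4)
        if match:
            cart.append(match[0])
    return cart

# build_cart("mlik\neggs\nbread")
# → ["whole milk", "large eggs", "sourdough bread"]
```

A production system would layer computer vision, brand preferences, and stock data on top, but "messy text in, concrete products out" is the core trick.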

What This Means for You
It removes the friction between "I need groceries" and "I have groceries." It's a perfect use of AI: taking a boring, repetitive task (typing a list) and automating it so you can get back to doing literally anything else.
Source: Uber Eats Launches AI Cart Assistant by Grocery Dive
Meta Glasses: The "Pub Mode" Update
TL;DR: Meta's Ray-Ban glasses just got a "super hearing" update that isolates voices in loud rooms.
If you struggle to hear your friends at dinner because the restaurant decides to blast house music at 7 PM, you're going to like this. The new "Hear Better" feature uses the glasses' microphones to boost the voice of the person you're looking at while dampening the background noise.
It's essentially the "cocktail party effect" on steroids.
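The classic engineering trick behind this is beamforming: a voice coming from straight ahead hits all the microphones in phase, while off-axis noise arrives at slightly different times, so simply summing the channels boosts the voice and partially cancels the noise. A minimal toy version (illustrative only; Meta's firmware is certainly far more sophisticated):

```python
# Toy delay-and-sum beamforming, the basic idea behind "look at someone
# to hear them better". Not Meta's implementation; signals are made up.

def delay_and_sum(mic_left, mic_right):
    """Average two mic channels sample by sample."""
    return [(l + r) / 2 for l, r in zip(mic_left, mic_right)]

# Voice from straight ahead: identical on both mics (in phase).
voice = [1, -1, 1, -1, 1, -1]
# Noise from the side: one mic hears it a sample later, so this
# alternating signal lands out of phase and mostly cancels in the sum.
noise_left  = [1, -1, 1, -1, 1, -1]
noise_right = [0, 1, -1, 1, -1, 1]

mic_left  = [v + n for v, n in zip(voice, noise_left)]
mic_right = [v + n for v, n in zip(voice, noise_right)]

out = delay_and_sum(mic_left, mic_right)  # voice survives, noise fades
```

Real devices steer that "sum" electronically toward wherever your head is pointing, which is why looking at someone is the control interface.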
What This Means for You
AI is entering the physical world. This isn't just a chatbot; it's software solving a biological problem. If you own a pair, update your firmware today and enjoy actually hearing the gossip.
Source: Meta Ray-Ban Update Adds Hearing Enhancement by The Verge
See you tomorrow.
Want to keep learning?
Get our free AI Starter Kit — 5 lessons delivered to your inbox.
Join readers learning AI in plain English. No spam, ever.
