- The Daily XP
🤖 Lavender = The Terminator
... and NSFW Loneliness Potion Nomi
Hello, fellow nerds 🧙
Wouldn’t it be great if we could submit a face scan to our computer, or game console, to verify our age in order to buy GTA 6 when it comes out? Yeah, we didn’t think so…
But apparently, the ESRB (Entertainment Software Rating Board) tried to slip a face-scanning age-verification proposal past the FTC, which replied with a “not gonna happen, you stinky face-scanning slimeballs!” But in reality, it was just, “no”.
Here are the daily quests in AI to complete today:
🤖 Lavender = The Terminator
🤐 NSFW Loneliness Potion Nomi
💘 Google Assistant Loves Romance
🧠 Mind Control is Getting Easier
Israel’s “Lavender” AI System is Terrifying if This is the Future
So, it turns out the Israeli military has been using a hush-hush AI database called "Lavender", and it’s far from your delightful-smelling candle.
This AI system is being used to identify potential Hamas targets in their recent bombing campaign in Gaza. While AI in warfare is becoming more commonplace, the scale and scope of Lavender's use is, well, terrifying to put it bluntly.
According to intelligence sources who spoke to +972 Magazine and Local Call, Lavender had flagged a jaw-dropping 37,000 Palestinian men as potential Hamas or PIJ (Palestine Islamic Jihad) operatives. The keyword here is “potential”.
Flagging these operatives is one thing, but Lavender turned into the goddamn Terminator.
Here’s where it gets terrifying
Here’s how Lavender works: it flags a potential Hamas operative in its database, and then the IDF (Israel Defense Forces) takes 20 seconds, tops, to verify whether that person is, in fact, a real threat.
After that, all hell breaks loose. The IDF greenlights a “dumb bomb” (an unguided munition) to strike the threat when he’s home, eliminating the target and leveling the surrounding area.
Oh, and it doesn’t stop there.
During these bombings, the IDF allegedly had a "bring us more targets" mentality, and pre-approved "collateral damage" ratios of 15-20 civilians per low-level militant, which only increased per rank of militant.
If this is all true, holy shit, this is juuuust a smidge concerning and raises a plethora of moral and legal questions.
With the Hamas-run health ministry reporting 33,000 Palestinians killed in the past 6 months, and entire families being wiped out by these "dumb bombs", it's clear that this conflict has taken a horrific toll on innocent lives.
We’re all in on AI over here, but even we can see that no AI system, no matter how advanced, can ever replace human judgment, empathy, and the value of innocent lives. Especially here. War is hell, and when algorithms are calling the shots, it’s all too easy to forget the human cost.
NSFW Loneliness Potion Nomi
Look, we’re not here to judge. There’s something about the term “genuine human connection” that’s taken on a whole new meaning.
Enter Nomi AI
This new AI companion has a load of tricks up its sleeves. Intuition, wit, humor, imagination, personality traits, interests, short and long-term memory, emotional intelligence, and the ability to send pictures to name a few.
Surprise, surprise: the majority of Nomi users don’t want just a buddy, pal.
Nomis can be entirely customized to be a friend, foe, or lover. Oh, and you can decide whether to create a single companion or a whole crew.
In fact, according to Nomi CEO Alex Cardinell, “most users have some sort of romantic relationship with their Nomi.”
What separates Nomi from other AI companions? Rapport. For example, if your AI has a shy personality, it’s going to take some time for them to open up.
Passion. Intimacy. Commitment.
You know, the primary ingredients of love, according to Sternberg’s triangular theory of love.
These pillars of love can’t come too easily… or can they?
Here’s an interesting take from Dr. K @HealthyGamerGG
“… an AI girlfriend or boyfriend might seem like the perfect solution to the minefield of dating and relationships. They’re becoming increasingly popular because they address the needs of lonely people who struggle to connect with people face to face.
But there’s another layer to this. What would get you REALLY addicted to your AI partner? If they mimicked real-life behaviors.
Because guess what, perfection is boring, and adding a little drama to the relationship can get you hooked. Plus, it’s way more painful to get rejected by a real-life human being than an artificial one.
Once in a while, your AI GF might start fights with you or ignore your calls. Maybe she won’t always say exactly what you want to hear when you want to hear it.
You know when someone tells you that you can’t do something? Suddenly you have this urge to do the very thing you’re not allowed to.
When your AI partner withholds certain things from you (compliments, time, sexting), you’ll only crave more.
And this is the perfect opportunity for companies to capitalize and profit. They’ll make your AI girlfriend withhold what you desire so that you keep using their services and, as a result, make them more money 😲”
⌛️ Throwback Thursday
A little blast from the past.
In the early days of Google Assistant, the devs needed a way to teach the assistant to speak conversationally. Back in 2016, there just weren’t a whole lot of ways to achieve this.
So what’d they do?
After going back and forth on traditional data sets, they decided to read the little baby Google Assistant bedtime stories from the greatest genre of all time: romance. A whopping 11,000 novels’ worth.
Apparently, romance novels (and fiction in general) are richer in language, phrasing, and grammar than nonfiction, which leads to a much more conversational model. Having read a textbook or two ourselves, we can certainly attest to nonfiction’s nap-inducing jargon.
Now the point of using romance novels wasn’t so Google Assistant would whisper sweet nothings in your ear —although maybe that could’ve worked.
Daily Delight
Just something fun and interesting around the web.
The Revolutionary Brain Cap
Have you ever wanted to play video games with your mind, but didn’t want the hassle of getting a Neuralink device implanted in your brain? Oh doctor, do we have a treat for you.
Apparently, this puppy is “plug and play,” and the test subjects didn’t take much time to master it at all. Who knows, maybe we all inherently have a little bit of mind control floating around in our heads, and it just takes a silly swim-cap device to unlock it.
What’s happening inside the Realm
A list of side quests to explore and more.
More than 200 artists just signed an open letter condemning the use of AI to replace human artists.
In a hilarious turn of events, Discord accidentally viewbotted its own YouTube video and made history in the process.
Looks like Google's getting ready to put up a paywall for all that sweet, sweet AI-generated content. Because apparently, having access to the sum total of human knowledge and creativity isn't enough.
Perplexity is looking to serve up ads that let brands influence the related-questions section. Hopefully, this won’t devolve into pure spam.
That’s all the quests we have for today. Check back tomorrow for more!
Looking to become a sponsor?
Was this email forwarded to you? Subscribe here.