
YouTube Now Lets You Filter Shorts Out of Search

By: Jake Peterson

Internet videos have always been addictive, but short-form content is a whole other beast. Whatever platform you watch them on, these brief clips pull you in and don't let go, and, before you know it, you've mindlessly scrolled through hours of videos, most of which you'll never remember watching.

YouTube Shorts are no exception. But unlike TikTok or Instagram, short-form content is not the main source of videos on the platform. YouTube, of course, hosts long-form videos first and foremost, and is the sole reason why many of us visit the site or app. Shorts are just an afterthought, but an afterthought that YouTube pushes hard. You might hop on to watch a specific video, or check out new content from your subscriptions, but to do so, you'll have to push past rows of "Shorts" all vying for your attention. God help your afternoon if you accidentally click on one.

If you like YouTube Shorts, please disregard. But for the rest of us who just want to find and watch standard videos on YouTube, there's now some respite: As part of a larger update to search filters and content discovery, YouTube now allows users to filter out Shorts in searches. The company is pitching this as a way to separate searches between either Shorts or traditional videos, but you'll never catch me searching specifically for Shorts. What YouTube has done here, at least for users like myself, is create a way to exclude Shorts from any particular search.

How to remove YouTube Shorts from search

To start, open YouTube and search for something. You should now see a series of options along the top of the display, one of which is "Videos." Choose it, and you'll reload the search with only long-form videos. Huzzah. You can also do the same from the greater search filters settings: On desktop, when the results appear, select "Filters" in the top right, then look for "Type" on the left of the "Search filters" pop-up window. On mobile, tap the three dots icon, then choose "Search filters." On both platforms, you'll find the "Videos" option here.
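
If you'd rather not click the filter every time, you can bake it into the search URL itself. The sketch below builds a results URL with the "Type: Video" filter token pre-applied; note that the `sp` value is an observed, undocumented parameter (it's what YouTube's own "Videos" filter produces in the address bar), so treat it as an assumption that could change without notice.

```python
# A minimal sketch: build a YouTube search URL with the "Type: Video"
# filter pre-applied, so a bookmark or script opens Shorts-free results.
# The sp token is observed behavior, not a documented API.
from urllib.parse import quote_plus

VIDEO_FILTER = "EgIQAQ%3D%3D"  # observed token for the "Type: Video" filter

def video_only_search_url(query: str) -> str:
    # quote_plus handles spaces and special characters in the query itself
    return f"https://www.youtube.com/results?search_query={quote_plus(query)}&sp={VIDEO_FILTER}"

print(video_only_search_url("lo-fi study mix"))
```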

Unfortunately, this isn't something you can set and forget: You'll need to choose this option every time you search for something, which is definitely a bummer. That said, at least there's some way to filter Shorts out of a search—especially if those Shorts were impacting your ability to find what you were looking for in the first place.

YouTube might not ever let us disable Shorts completely, but there are tools to get around them. You can choose to limit how many Shorts you watch in any given day—though the guardrails aren't necessarily strict. The company also lets you tell them to show you fewer Shorts on the home page, but if that's not enough, you can also install an extension to block Shorts from your feeds.

Other changes to YouTube filters

In addition to this new Shorts filter, YouTube made some adjustments to filters and sorting options. "Sort By" is now known as "Prioritize," and while YouTube doesn't say whether it changed the function, it does say the menu "aims to maximize utility." The company also changed the "View count" sort option to "Popularity." The menu still takes view count into consideration, but now also watch time, to sort videos in a search by the algorithm's assumed popularity.

Finally, YouTube is removing the "Upload Date - Last Hour" and "Sort by Rating" options from search. The company says you can still find videos uploaded most recently from the "Upload Date" filters.


CES 2026: Ford Is Launching Its Own AI Assistant

By: Jake Peterson

Listen up, Ford drivers: You're getting a new AI assistant this year. During a decidedly low-key CES keynote, the company announced Ford AI Assistant, a new AI-powered bot coming to Ford customers in the early half of 2026.

While the company has plans to integrate the assistant into Ford vehicles directly, that isn't how you'll first experience this new AI. Instead, Ford is rolling out Ford AI Assistant to an upgraded version of its Ford app first, and plans on shipping cars with the assistant built-in sometime in 2027. In effect, Ford has added a proprietary version of ChatGPT or Gemini to its app.

How Ford AI Assistant works

Ford's idea here is to offer users a smart assistant experience directly tied to their Ford vehicle. In one example, the company suggests a customer could visit a hardware store looking to buy mulch. Said customer could take a photo of a pile of bags of mulch and ask the assistant, "how many bags can I fit in the bed of my truck?" Ford AI Assistant could then run the numbers, and offer an educated estimate of how much mulch the customer can buy and take with them at one time.
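
For a sense of the arithmetic involved, here's a back-of-the-envelope version of that estimate. The bed and bag dimensions below are hypothetical stand-ins, not figures from Ford, and a real answer would also account for stacking and packing losses.

```python
# A rough sketch of the mulch estimate described above. All dimensions are
# assumptions: a roughly mid-size truck bed and a common 2-cubic-foot bag.

BED_LENGTH_IN, BED_WIDTH_IN, BED_DEPTH_IN = 67, 50, 21
BAG_VOLUME_CUFT = 2.0

# Convert the bed's volume from cubic inches to cubic feet (1728 in³ = 1 ft³)
bed_volume_cuft = (BED_LENGTH_IN * BED_WIDTH_IN * BED_DEPTH_IN) / 1728

# Naive count by volume; real-world stacking would fit somewhat fewer
bags = int(bed_volume_cuft // BAG_VOLUME_CUFT)

print(f"{bed_volume_cuft:.1f} cu ft of bed space ≈ {bags} bags")
```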

Of course, other AI assistants can do similar calculations. Send ChatGPT the same photo, and ask the same question—specifying the model of your truck—and the bot will run the numbers itself. The difference, in Ford's view, is that Ford AI Assistant is connected to your vehicle specifically. It can read all the sensors in your car, so it knows, for example, how many people are currently traveling with you, your current tire pressure, or, really, anything and everything about your car. According to Doug Field, Ford's chief officer of EVs, digital, and design, the company's goal with the assistant is to offer answers customers can't get from other sources. ChatGPT certainly doesn't have access to every sensor embedded in your car, so Ford does have the advantage there.

Ford didn't go out and build its AI tech from scratch, however. The company tells TechCrunch that Ford AI Assistant is hosted on Google Cloud, and runs on "off-the-shelf LLMs." Still, that likely won't have much of an impact on whether or not customers use this new assistant. Instead, that will come down to how useful they find the AI assistant in the app.

Will Ford AI Assistant actually be useful?

As someone who rarely uses AI assistants, I'd imagine I'd find little use for one if I owned a Ford. That being said, there are times when it could genuinely be useful to have external access to your car's information. I could probably eyeball how many bags of mulch would fit in my trunk, but I can't tell you my exact odometer reading without starting up my car. The same goes for my tire pressure: It'd be helpful to know before getting in my car whether I should stop somewhere to fill up my tires before heading to my destination.

Of course, there's also a privacy discussion to be had here. Modern cars are already privacy nightmares, but there's something a bit unnerving about an AI assistant that knows everything about my car.


CES 2026: 'Rescue Retriever' Wants to Help Firefighters Save Your Pets

By: Jake Peterson

We may earn a commission from links on this page.

No pet owner wants to think about what might happen to their animal friend in the event of a fire. As a dog owner, I know I don't. But fires do happen, and pets can't follow your family fire plan. Rescue Retriever wants to change that. I spoke with the company during CES' Pepcom event, and learned how it's working to make it easier for firefighters to find your pets in the event of a fire. (Rescue Retriever first launched back in March of 2024. The company was started by two brothers—one of whom is a former firefighter.)

During a fire, pets tend to run to where they feel safe—maybe that's under the bed, or somewhere tucked away in a room. That makes it difficult for firefighters to locate pets during an emergency: In a situation where seconds count, they don't have the ability to search every corner of a burning home. Unfortunately, as a result, pets sometimes don't make it out. Rescue Retriever hopes to prevent that.

Rescue Retriever's smoke detector is installed near your pet's safe space

In an attempt to solve that problem, Rescue Retriever's smoke detector works a bit differently. It isn't supposed to be placed where your current smoke detector is; rather, the idea is to place it where your pet is most likely to go when scared. If you know they like to hide under your bed, for example, you can place the detector there. Once the device detects smoke during a fire, a bright light begins to strobe inside, diffused through holes placed throughout the device. (Even under the bright lights of Pepcom, the strobe looked intense to me.) That way, firefighters can look for the strobing lights when trying to locate pets in your home. There's even an accompanying sticker you can place on your home's window, so firefighters know you have a pet to look out for.

The device is silent, so as to not scare your pets away from their hiding space. It wouldn't do any good to have a beacon for firefighters to locate your pet, only to have them scurry off because of an alarm.

The Rescue Retriever Fire Tag lights up in an emergency

[Image: The Fire Tag. Credit: Lifehacker]

New from the company this year is the Fire Tag, which brings the strobe light from the main unit to a small tag you can place on your pet's collar. The tag syncs up to the main unit using RFID. When the main unit detects smoke and starts strobing, the Fire Tag will pick up the signal and follow suit.

It's great that the main unit doesn't make a noise to scare your dog, but that doesn't guarantee they will go to the spot where it's installed—maybe the house smoke alarm spooks them, or they break from their usual pattern and hide somewhere far from the main unit. Having a Fire Tag on their collar gives firefighters a better shot at locating the pet, wherever they may be in the home.

Rescue Retriever tells me these tags have a range of up to a quarter mile, so if your dog runs away from your house, their collar might still light up. Still, the Fire Tag opens up, allowing you to place a tracker like an AirTag inside, so if your dog runs out of that quarter-mile range, you can still track them once they come within Bluetooth range of a device on the Find My network.

Rescue Retriever says the Fire Tag is ready for production, and, when available, can be purchased either individually ($29.99), or as part of a larger bundle with the main Rescue Retriever unit ($89.99). The main unit can be purchased now for $39.99, but it also comes in bundles with accessories for varying prices.


CES 2026: Pawport's Smart Dog Door Launched With One Big Upgrade

By: Jake Peterson

If you've been following smart pet tech closely over the last couple years, you might know Pawport. I saw the company's smart pet door at the last two CESes, and it was my first introduction to this specific product category. At the time, Pawport hadn't yet launched; now it has. But there's one key difference between the pet door Pawport showed off at CES 2025 and the one that eventually launched late last year.

Pawport's smart pet door

In a lot of ways, Pawport's official pet door is the product I saw last year: It's made of aluminum, and is reportedly bulletproof. (The company has a model riddled with bullet holes on display.) This smart door is designed to fit over an existing pet door in your home. If you don't have one, the company also sells an insert to install into your wall. It's weatherproof, has deadbolts on the top and bottom for security, and connects to smart home and voice control assistants. It can even work on battery power, though you can also plug it into power.

But Pawport tells me the big change between last year's prototype and its official launch model is the technology used in the smart tag that attaches to your pet's collar. Now, Pawport uses a type of ultra-wideband (UWB) technology to communicate between the tag and the door. That gives the user more control over things like how far your pet has to be from the door before it opens: Maybe you prefer your dog to be two feet away inside before it opens, but five feet away when outside. Pawport says UWB also gives the door more security than before. Now, the tech enables end-to-end encryption between the dog collar and the door, so, according to the company, the door can't be hacked. It's a bold claim, but, if true, that should give pet owners some peace of mind.
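
As a sketch of what that asymmetric open-distance rule might look like in logic, here's a minimal example. The thresholds, side detection, and function names are all hypothetical; Pawport hasn't published how its firmware makes this decision.

```python
# A minimal sketch of the behavior described above: the door opens at a
# shorter range when the pet approaches from inside than from outside.
# Thresholds and names are hypothetical.

INSIDE_OPEN_FT = 2.0
OUTSIDE_OPEN_FT = 5.0

def should_open(distance_ft: float, approaching_from: str) -> bool:
    threshold = INSIDE_OPEN_FT if approaching_from == "inside" else OUTSIDE_OPEN_FT
    return distance_ft <= threshold

assert should_open(1.5, "inside")       # close enough from inside
assert not should_open(3.0, "inside")   # too far from inside
assert should_open(4.0, "outside")      # the outside threshold is more generous
```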

[Image: Pawport's UWB smart tag attached to a dog collar. Credit: Lifehacker]

Pawport now also gives pet owners control over whether pets can run outside in adverse weather conditions. If there's rain or lightning, users can configure the app to lock the door from the inside, so pets can't scurry off into an unsafe backyard. That only locks one way, as well, so if the pet is already outside when the weather lock takes effect, they can still get back inside.

In addition, the company tells me that UWB has extended the battery life of the tag this time around. While the previous prototype was rated for around six months of battery life, the new model has an advertised battery life of one year, though users could see their batteries last 15 to 18 months.

Pawport plans to eventually sell a dedicated outdoor pet door as well, but at the moment, it only sells the indoor unit. That starts at $599 for the company's six standard colors in the medium-size unit, and $849 for one of eight "signature series" designs, which add a polished wood look to the door. You can upgrade to a large unit for another $50, or to the extra-large unit for an extra $100.


CES 2026: AMD Just Showed Off 'Helios,' the Hardware That Will Power the AI Content in Your Feeds

By: Jake Peterson

When you come across an AI video on Instagram, or watch ChatGPT respond to your query, do you ever think about how that content was generated? Beyond the actual programs and prompts, generative AI takes an enormous amount of compute to support, especially as it skyrockets in popularity. As such, AI companies are looking for more power than ever, which means, of course, turning to those that make the hardware.

AMD calls Helios "The world's best AI rack"

During a Monday evening keynote, AMD's CEO Dr. Lisa Su showed off the hardware that will soon power everything from ChatGPT to the AI videos overwhelming your feeds. Su introduced—against a backdrop of dramatic music—"Helios," the company's upcoming AI rack, which packs a staggering amount of computing power into a rack that weighs nearly 7,000 pounds.

Each "cross-section" of these racks is powered by four key AMD pieces of hardware: the company's new AMD Instinct MI455X GPU, the new AMD EPYC "Veince" CPU, the AMD Pensando "Vulcano" 800 AI NIC, and the AMD Pensando "Salina" 400 DPU. There are some staggering stats here: Helios is capable of 2.9 exaflops of AI compute, and comes with 31 TB of HBM4 memory. It offers 43 TB per second scale out bandwidth, and is developed with 2nm and 3nm architecture. The rack has 4,600 "Zen 6" CPU cores, and 18,000 GPU compute units. In other words, this isn't your average piece of hardware.

Su's pitch is that the AI industry is in need of this additional compute power. She noted that the world used one zettaFLOP of computing power on AI in 2022, compared to 100 zettaFLOPs in 2025. (For the curious, one zettaFLOP is 10 to the power of 21 floating-point operations.) It's no surprise: AI is everywhere, and many of us are using it—whether we know it or not. Some of us are using it overtly, generating AI videos or running chatbots daily. But others are using AI quietly embedded in functions, like live translation.
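
Those units are easy to gloss over, so here's the comparison worked out: a 100x jump over three years implies usage compounding at roughly 4.6x per year.

```python
# Working out Su's comparison: 1 zettaFLOP (10^21 floating-point operations)
# of AI compute in 2022 vs. 100 zettaFLOPs in 2025.

ZFLOP = 10**21
usage_2022 = 1 * ZFLOP
usage_2025 = 100 * ZFLOP

growth = usage_2025 / usage_2022  # 100x over three years
annual = growth ** (1 / 3)        # implied compound annual growth

print(f"{growth:.0f}x overall, ≈{annual:.1f}x per year")  # 100x overall, ≈4.6x per year
```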

Su welcomed reps from OpenAI, maker of ChatGPT, and Luma AI, which creates generative AI video content, to talk about how additional compute helps their programs. But during Luma AI's demonstration of its hyperrealistic video generations, all I could think about was how this type of content is already tricking people into thinking it's real, when it's entirely fabricated—not to mention the impact on human artists. AMD is optimistic about AI, and the data centers powering it, but critics have been pushing back, citing the impact on the communities where these data centers are being built.

Helios will likely be a major success for AMD, but it comes at an interesting time for tech, and AI in general. AI is more popular than ever, but it's also more controversial than ever. I see hardware like Helios only fueling the fire in both directions.

AMD Ryzen AI 400 series

In addition to Helios, Su announced the AMD Ryzen AI 400 series. These newest chips come with up to 12 "Zen 5" CPU cores and 24 threads, 16 RDNA 3.5 GPU cores, a 60 TOPS XDNA 2 NPU, and memory speeds of 8,533 MT/s. AMD says the Ryzen AI 400 series is 1.7 times faster at content creation and 1.3 times faster at multitasking when compared to Intel Core Ultra 9 288V.

These new chips will ship soon in a number of major PC brands, including Acer, Asus, Dell, HP, Lenovo, Beelink, Colorful, Gigabyte, LG, Mechrevo, MSI, and NEC.


CES 2026: This Water Dispenser Uses Facial Recognition to Track Your Cats' Drinking Habits

By: Jake Peterson

I spent some time with Petkit on the CES show floor on Monday, where I got to see the company's automatic wet food cat feeder, its newest smart litter box, as well as an AI-powered water dispenser. Of course, a water dispenser should ideally do one thing: dispense fresh water for your pet. The Eversweet Ultra water dispenser, however, adds a number of smart features to the mix, some of which actually seem particularly useful for owners of multiple cats.

The Eversweet's camera tracks your cats' drinking habits

I first learned about this water dispenser last week when Petkit officially announced it, and some of the specs and features are the same on paper as they are in person: This smart dispenser comes with a five-liter tank, as well as a 1080p camera with a 140-degree wide-angle view. That allows you to see your cat approach the dispenser, as well as monitor their drinking habits. You can either tap into a live stream to watch them as they drink, or check out prerecorded clips saved to the app. Petkit tells me the app even chops up clips for you to post to your socials automatically—if you think your friends and family want close-ups of your cat taking a drink.

One of Petkit's selling points here, however, is the health tracking. The clips aren't just for posting to stories; instead, the camera can track how your pet drinks over time, noting any discrepancies or changes. You can see daily, monthly, and even yearly drinking trends, so you know whether your cat is drinking an abnormal amount of water for January. I'd want to consult a veterinarian before definitively stating whether this type of tracking is effective, but I can imagine it might be useful to know whether your cat changes its behaviors—something you might not notice if you aren't diligently watching them drink.
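
To make "noting discrepancies" concrete, here's one simple way such a flag could work: compare today's intake against a trailing average. This is purely an illustrative sketch; Petkit hasn't said how its detection actually works, and the 30% threshold is invented.

```python
# An illustrative sketch of drinking-habit anomaly detection, not
# Petkit's actual method. Flags a day that deviates sharply from a
# trailing 14-day baseline.
from statistics import mean

def flag_anomaly(daily_ml: list[float], threshold: float = 0.3) -> bool:
    """Flag if today's intake deviates >30% from the prior 14-day average."""
    if len(daily_ml) < 15:
        return False  # not enough history yet
    baseline = mean(daily_ml[-15:-1])  # the 14 days before today
    today = daily_ml[-1]
    return abs(today - baseline) / baseline > threshold

print(flag_anomaly([200] * 14 + [120]))  # True: a 40% drop from baseline
```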

This tracking holds up even if you have multiple cats. The Eversweet's camera uses machine learning for facial recognition, and can identify individual cats in your home. That health tracking can apply to specific cats as they drink water, so you shouldn't have results mixed between your various pets. If you want to know how one cat's drinking compares to another's, this should help you keep track.

Eversweet's water features

While some water dispensers recycle the water, the Eversweet disposes of the water after a set period of time. I knew about this from the press release, but it was fun seeing the machine dump the water in real time: The pump that serves to keep the water running cuts out, and the entire mechanism lifts up to reveal a drain. The water dumps out, and a spout refills the tank, only to dump the water again. This is an attempt to clean out the tray, and remove any bits of hair or debris that might've been left behind.

Petkit also made a point of showing me that the dispenser's pump is always running, which turns the Eversweet into a bit of a water feature—not that you should buy one just for decoration. The idea is that cats are more attracted to moving water than to still water, which encourages them to drink. I don't have a cat, so I can't speak to that, but it's an added aesthetic perk—until the water dumps out, of course.

Petkit says it's aiming for an April release for the Eversweet, which will be available on both Amazon as well as through Petkit's official site.


CES 2026: Petkit's Yumshare Daily Feast Is an Automatic Wet Food Feeder for Your Cat

By: Jake Peterson

I don't have a cat, but I do have a dog, and my dog can be a picky eater. After months of trial and error, the only way we could get him to consistently eat breakfast and dinner was with wet food. That means there's no scenario in which I, or someone watching my dog, can outsource the task of feeding him to a machine. That's not the case for all cat owners: Cats sometimes stay home alone for extended periods of time, where an automatic food dispenser becomes essential. I can only imagine, then, that cat owners who, like me, feed their pet wet food have a challenge: Someone has to be there to feed the cat.

That's what intrigues me about Petkit's Yumshare Daily Feast, an automated cat feeder that specifically works with wet food. I covered the Yumshare briefly when Petkit first announced it last week, but I got some hands-on time with the product today on the CES show floor. And while the device I saw isn't going to market just yet, I was intrigued.

The concept itself is pretty simple, but a bit more complex in execution. As you might expect, the feeder distributes portions of food for your cat to eat when you're not available—only here, that food is wet, not dry. To achieve this, Petkit outfitted the Yumshare with a chamber that can hold up to seven pouches of wet food at once. These pouches are designed for the Yumshare alone: Right now, the only company making them is Food Chain, which uses Petkit's proprietary design to fit the Yumshare, but it's possible more companies will jump on board in the future.

[Image: The food chamber that holds seven pouches of wet food. Credit: Lifehacker]

When the Yumshare first opens one of these pouches, it sets a timer: After 48 hours, the pouch is discarded. In the meantime, the machine can automatically dole out portions set by the user into a cup, which is stored in a separate chamber in the device. Once the portion is distributed, the pouch is sealed until the next portion is needed. The machine also uses UVC ultraviolet light to sanitize the package, with the goal of eliminating bacteria.

The automation extends to the portion of wet food itself, as well. Whether your cat eats all of the food, some of the food, or none of the food, the Yumshare can discard what's left when it detects it's time to do so. That might be the AI camera identifying that the food has dried up, or an internal clock recognizing that too much time has passed for the food to be safe to eat. Either way, the food drops into a waste compartment, which can hold up to 15 cups at once. The stand remains empty until it's time to hand out another portion of food.
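
Here's a minimal sketch of that freshness logic as described: a 48-hour clock on each opened pouch, and a served portion discarded when the camera flags it as dried out or a time limit passes. The 48-hour pouch window comes from Petkit; the per-portion limit is a hypothetical placeholder.

```python
# A minimal sketch of the Yumshare's freshness rules as described above.
# The 48-hour pouch window is Petkit's figure; the per-portion time limit
# is a hypothetical placeholder.
from datetime import datetime, timedelta

POUCH_WINDOW = timedelta(hours=48)
PORTION_LIMIT = timedelta(hours=4)  # hypothetical

def pouch_expired(opened_at: datetime, now: datetime) -> bool:
    # Discard the pouch once 48 hours have passed since first opening
    return now - opened_at > POUCH_WINDOW

def discard_portion(served_at: datetime, now: datetime, looks_dry: bool) -> bool:
    # Clear leftovers if the camera flags dried food or too much time passes
    return looks_dry or (now - served_at > PORTION_LIMIT)
```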

How large or small each portion is, as well as how often portions are distributed, is up to you. Petkit showed me a mockup of the Yumshare's app, which lets you customize this feeding schedule. The app will also tell you how many portions have been distributed, and how many are left in the chamber. You can also take a look at a live video feed of your cat, via the 4K camera embedded in the device, or view a clip history of your cat's feeding habits.

My only question is the lack of refrigeration: Petkit says there is no need for it, since the pouches are all sealed until use, closed in between portions, and the chamber uses ultraviolet light to sterilize against bacteria. But I'd want to confirm there are no health risks to keeping an opened pouch unrefrigerated for up to two full days, even with that UVC light. Assuming it's safe, I'm quite impressed with this prototype. If I were a cat owner who frequently needed to leave my pet alone at breakfast and dinner, this is something I'd have my eye on.


CES 2026: Hisense Just Announced the RGB MiniLED Evo

By: Jake Peterson

Hisense is all about TVs at this year's CES—specifically, how those TVs display color. If the company's keynote is any indication, Hisense is extremely invested in leading the charge in color reproduction. Though its tagline this year is "Innovating a Brighter Life," the pitch is less about how bright its TVs are, and more about how true to life their colors are, especially when it comes to accurately displaying the filmmakers' original intent. Will consumers buy a TV because their favorite movie looks a bit more like the director intended? I'm not sure. But that's largely the idea behind Hisense's new RGB MiniLED Evo.

RGB MiniLED Evo

[Image: RGB MiniLED Evo. Credit: Lifehacker]

Hisense's biggest announcement of the day is its RGB MiniLED Evo. This iteration of the company's RGB MiniLED technology comes with three key upgrades: First, there's the "Chromagic" precision backlight. The company says this technology helps avoid color bleeding and tint shifting when watching content. It also boosts the colors offered by existing RGB MiniLED technology: Hisense says the new standard can display 110% of the BT.2020 color gamut using four primary colors (red, green, blue, and sky blue, or cyan), while RGB MiniLED can achieve 100% of BT.2020 with the standard three primary colors. The company says Evo also comes with AI-powered RGB color dimming with "134 bit precision."

Second, RGB MiniLED Evo has a "Hi-View AI engine" to sync color across the TV's backlight. This engine comes with a three-core RISC CPU and a 2TOPS NPU for AI processing. Hisense claims the chip provides a 40% boost in computing performance, a 70% improvement in scenario recognition computing power, and a 100% upgrade in scenario adaptation computing capability. The company says the RGB MiniLED Evo can help display shows and movies as the original creators intended them to look. Hisense compared it to OLED, which it says displays content with too little brightness, and QD MiniLED, which it says compromises the color by being too bright.

Hisense's RGB MiniLED—not necessarily just Evo—also supposedly cuts down on blue light. Hisense says its new display tech emits 60% "less harmful" blue light than QD OLED, though the science doesn't necessarily support the claims that blue light is any worse for you than other light colors. The company also says RGB MiniLED is 30% more energy-efficient than QD OLED, though it didn't include any references to back up those claims in the keynote. Finally, Evo comes with an AI calibration feature, which Hisense touts as a way for users to turn their TVs into professional reference monitors.

The company says RGB MiniLED Evo is coming to its product lineup, including the U8, UX, U9, and U7 series, this year.

Other Hisense announcements at CES 2026

[Image: Hisense's MXS announcement. Credit: Lifehacker]

While the RGB MiniLED Evo was undoubtedly Hisense's biggest announcement this year, the company also ran through some other news at the keynote. First, there's the XR10, a new projector that can project an image of up to 300 inches. It comes with liquid cooling, lens shift supporting 4K projection, and 6,000 lumens of brightness.

Hisense also announced the MXS MicroLED TV, which can scale up to 163 inches. Despite devoting only a moment to the TV, the company says the MXS won a CES 2026 Innovation Award. Hisense also made some announcements about its TV OS, including features like weather, calendar, and integrations with other smart home devices. But perhaps most notably, Hisense announced a partnership with Microsoft to bring both Copilot and Xbox Cloud Gaming to its TVs. That's pretty big news.

Finally, if you're both a Hisense and a FIFA fan, you can buy products from the "Hisense Elite Collection," which are designed specifically for the 2026 FIFA World Cup. I'm not sure how big that crossover is, nor do I think I'd buy a TV because it was made for one soccer event, but it was part of Hisense's announcement.


My Five Favorite Things I Saw at CES Unveiled 2026

By: Jake Peterson

While CES doesn't technically kick off until Tuesday, the conference gets a bit of a soft launch with CES Unveiled. This event hosts a ton of companies, all proudly showing off their latest products and concepts in one giant room. While there's plenty to write home about, five products in particular this year caught my eye:

Tombot

[Image: Tombot's robotic puppy, Jennie. Credit: Lifehacker]

Tombot's robotic puppy, "Jennie," isn't supposed to be a pet replacement; Jennie is specifically designed to help people with Alzheimer's. The bot is a healthcare device, and is made to not only comfort owners, but to monitor "sundowning," or the confusion that some living with Alzheimer's experience in late afternoon and at night.

I can't speak to the medical claims, but Tombot impressed me. I've seen products like this before, but what struck me was the realism. That's not to say Tombot's robot tricks you into thinking there's a real puppy on the table, but the company has designed the bot with enough motors and sensors to feel convincingly lifelike. When you look at Jennie, she looks at you; when you move, her face reacts in kind, powered by cleverly placed motors. There are capacitive touch sensors to react to touch, light sensors to adjust to the lighting of the room, gyros for orientation, and microphones to respond to sounds.

Jennie is designed to be interactive: You can call its name—either Jennie, or a name you set in the app—and it responds to the call, potentially with a bark. Tombot says that it hired a number of 10-week-old lab puppies to record the voices for its bot. In all, Jennie will have about 1,500 unique behaviors when Tombot launches her this year.

Tombot told me that Jennie is designed to last all day on a single charge. When you aren't home, Jennie can drop into a sort of low power mode, which should last over a week. When you come back, Jennie should immediately welcome you home. Tombot says its bot will cost $1,500 at launch, but will offer financing options.

CubicScreen

[Image: A 3D photo on CubicScreen. Credit: Michelle Ehrhardt/Lifehacker]

What if you could turn your iPhone into a 3DS? That's what I took away from the premise of CubicSpace's CubicScreen, anyway. The company makes a screen protector for your iPhone embedded with an optical filter that allows you to view spatial photos and videos in 3D, without the need for glasses or a separate device. It's your 2D iPhone in 3D.

I was a bit skeptical walking past the booth, but, in practice, the tech really works. CubicSpace had some 3D photos and videos already stored in the CubicScreen app on its demo iPhones, and, when you're looking at the screen, they really did appear three-dimensional. This isn't "pop out of your screen" 3D, mind you; rather, it's a depth effect. If you've ever used the 3D feature on a 3DS, this is that experience: You'll be able to see the depth behind subjects in your photos and videos, which adds to the immersion of the image.

The system uses eye tracking to adjust the effect as you view the image. To that point, the app can support zooming while retaining the 3D effect. It's a little trippy: You zoom in on a picture, and, most of the time, it snaps into 3D instantly. Sometimes, the effect falls out, and you can tell you're looking at a distorted image. But for most of my demo, the effect held.

This 3D effect wasn't just a trick of editing, either: CubicSpace took a photo of Lifehacker's Associate Tech Editor, Michelle Ehrhardt, at the booth, which instantly took on the 3D effect. The screen overlay and the software appear to work fast.

CubicScreen will cost $79 when it launches. There is currently an order page, but the company isn't accepting payments yet.

Allergen Alert

[Image: Allergen Alert's mini lab. Credit: Jake Peterson/Lifehacker]

If you have a food allergy, you know the stress that comes when you're not in control of your meals. There's a risk to eating food you didn't make yourself, and for some of us, that risk isn't an option.

That's why I found Allergen Alert's "the mini lab" so intriguing. The idea is, when you want to know whether your food contains a certain allergen, you can scoop a sample of it into the provided container, place that container into the mini lab, and within two minutes, you'll have a positive or negative result. The device itself isn't large—about the size of a portable speaker—which makes it feasible to take out to restaurants and other people's houses.

Right now, the company says the mini lab can detect milk and gluten down to five parts per million, but the goal is to detect other major allergens, like egg, fish, soy, sesame, peanuts, shellfish, and tree nuts. Of course, I have no way to actually test whether it can detect milk and gluten at this time, so I can't necessarily endorse the product. In fact, it isn't currently available: Allergen Alert tells me the product is supposed to launch in September for $200.

However, if Allergen Alert is correct in its claims, this product could be a game changer. A two-minute check on that supposedly allergen-free meal could literally save someone's life, and offer peace of mind to those who could have serious adverse effects from ingesting an allergen.

Birdfy Hum Bloom

[Image: Birdfy Hum Bloom. Credit: Jake Peterson/Lifehacker]

Birdfy brings a new smart bird feeder concept to CES just about every year. And while the company had its previous models on display at Unveiled this year, its newest prototype grabbed my attention.

The Hum Bloom is specifically designed for hummingbirds, with a 4K camera that can shoot up to 120fps slow-motion video. While that'd make for some dramatic landings for any bird, it's particularly ideal for capturing hummingbirds that flap their wings hundreds or thousands of times per minute.

Birdfy says the Hum Bloom's AI can identify more than 150 hummingbird species, so you'll know which type of bird you're watching in slo-mo on your phone. Birdfy claims the feeder is "leak-proof," and it also comes with an ant moat to keep out bugs. Personally, I'm stuck on Birdfy designing a slo-mo camera system just for hummingbirds. I look forward to seeing that footage from reviewers in the future.

Opsodis 3D speaker

[Image: Opsodis 3D speaker. Credit: Jake Peterson/Lifehacker]

This is one of those products my picture simply will not do justice to. In order to understand why I was so taken by this speaker, you'd need to hear it for yourself.

Admittedly, I wasn't expecting much when I agreed to demo the speaker. At first glance, the Opsodis just looks like your typical wireless speaker. That changed once the rep played a video on a connected iPad: Suddenly, I was hearing sounds next to my ears; behind my ears; around my ears. I wasn't really paying attention to what was happening on screen, because I was too distracted wondering how I was hearing everything I was hearing from a relatively tiny speaker directly in front of me.

As it turns out, this wouldn't work if I had the speaker placed just about anywhere. The demo used Opsodis' "narrow mode," one of three audio modes the speaker is capable of. Narrow mode is the stronger spatial audio experience, but requires the speaker to be placed close and directly in front of the user: specifically, 60 centimeters away, or 23.62 inches. "Wide mode," which I didn't try, offers a "softer" spatial audio experience, while the third mode simulates standard stereo audio.

While this won't necessarily offer a movie theater-like experience just by placing it in front of your TV, this was some of the most fun I had at CES Unveiled tonight.


Punkt Has a New Smartphone for People Who Hate Smartphones

By: Jake Peterson

If you've ever looked into trading in your smartphone for a dumbphone, you might have stumbled across Punkt. The Swiss company's MP01 and MP02 phones were purposely not "smart"; rather, they were minimalist slabs of plastic, sporting a tiny display and an array of large, physical buttons. The point of owning one of these Punkt devices isn't to sit scrolling on your smartphone for hours on end; it's to use your phone when you need to—privately, at that.

The MC03 looks like any other smartphone at first

The company's MC line flips the script a bit, and that doesn't change with the latest entry: the MC03. While there's still a focus on privacy and minimalism, this newest device is virtually indistinguishable from other Android smartphones on the market—at least, in outward appearance. Gone are the physical buttons and tiny display; now, you have a large 120Hz OLED display, complete with a selfie camera at the top. Flip the MC03 around, and you'll find a set of four rear cameras, corralled in the top-left corner of the back. Aside from the large "Punkt" logo in the bottom right, this really could be any other phone.

What separates the MC03 from phones from Samsung or Motorola is what's on the inside, namely the OS. When you open the phone, you aren't greeted by a grid of app icons and widgets. Instead, you see a list of app and function names, without icons or colors. This is just about as simple an interface as you can expect from a device with a modern smartphone display, which may appeal to those who are looking for a minimalist experience.

That's because the MC03 isn't running stock Android. Like previous Punkt phones, this device runs AphyOS, an operating system built by Apostrophy. This custom OS is advertised as privacy-focused, something Punkt runs with for the MC03. According to the company, AphyOS can block tracking and profiling tools, and keeps out bloatware, hidden apps, and background services. The OS can also reportedly fight against spying with "hardened code" to block attacks.

The company says this new phone separates data and functions into two key "repositories": First, there's the "Vault," which includes Punkt-approved apps and that minimalist UI. Proton is a trusted company here, so you can expect to find Proton Mail, Proton Calendar, Proton Drive, Proton VPN, and Proton Pass in the Vault. Second, there's "Wild Web," which lets you install any app you want, albeit behind a strict system of safeguards and privacy settings. You can choose to download apps from either Punkt's privacy-focused app store, featuring programs approved by both AphyOS and Punkt, or from a store with "widely available apps."

In addition to that display, Punkt says the device comes with other hardware perks, like a 5,200mAh removable battery—a rarity with modern smartphones—IP68 water and dust resistance, and a 64MP camera.

Privacy isn't free on the MC03

All those features come at a recurring cost, however. After one year, you'll need to pay for AphyOS, which charges $10 per month. You can also choose to subscribe for three years for $129 (about $3.58 per month), or five years for $199 (about $3.32 per month). The phone itself costs $699, which is relatively expensive for a phone of this caliber, so whether or not the privacy and minimalism perks are worth the price will really come down to the individual customer. (I'm a huge privacy advocate, but the MC03 would definitely need to impress me before I commit to a subscription model for its OS.) If you don't want a new phone, there are hacks to dumb down your existing device into something more minimal—and, of course, there are steps you can take to protect your privacy on any device.
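
For clarity, here's the effective monthly cost of each AphyOS plan, computed from the plan totals Punkt quotes above.

```python
# Effective monthly cost of each AphyOS plan, from the totals quoted above.

plans = {
    "1 year (monthly billing)": (10.00, 1),  # $10 per month
    "3 years": (129.00, 36),
    "5 years": (199.00, 60),
}

for name, (price, months) in plans.items():
    per_month = price / months if months > 1 else price
    print(f"{name}: ${per_month:.2f}/month")
# 3 years works out to ≈$3.58/month, 5 years to ≈$3.32/month
```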

The MC03 isn't available yet. Punkt announced the new phone ahead of CES next week, and says the new device will launch in North America this spring. Pre-orders are currently available in Europe.


Petkit Just Announced Three New Smart Devices for Your Pets

By: Jake Peterson

Petkit just announced three new smart pet tech products ahead of CES. Despite the conference not kicking off until Tuesday, the company shared details about its upcoming devices, which are likely to launch this spring. Perhaps unsurprisingly, Petkit advertises all three pet devices as "AI-powered," and while that's been a go-to gimmick for companies since 2023, there are some unique applications here—assuming the devices actually do what Petkit says they do.

Purobot Crystal Duo litterbox

Petkit says the Purobot Crystal Duo is the first AI-powered open-top litter box, and that the Purobot uses the tech for health detection. This device has a 720p AI-powered camera that monitors your cat's stools as well as their behavior when visiting the litter box. The Purobot looks at things like stool consistency and urine pH, as well as "vocalizations," noting it all in the Petkit app. You can take any of those findings to your vet if you have concerns.

Petkit says the Purobot uses crystal litter to absorb urine and dehydrate waste to manage odors. The company claims users can wait 30 days before needing to scoop the contents of the litter box, which also comes with disposable trays. The company plans to launch Purobot in July, on both Amazon as well as Petkit's official site.

Eversweet Ultra water dispenser

Petkit's new water dispenser also uses an AI-powered camera to track health habits, only this time through your pet's "hydration habits." The device ships with a 1080p camera with a 140-degree wide-angle view that tracks your pet as they drink, even in the dark. The Eversweet supports facial recognition for multiple pets, which, in theory, allows it to track the habits of specific pets drinking from the same device.

The dispenser comes with a five-liter tank, which Petkit claims supports two full weeks of drinking water for one pet. The device detects when the water is low, and refills the bowl when needed, passing it through an "antibacterial filter cube" to prevent bacterial growth. Any "waste" water is not recirculated through the dispenser, and is instead separated, so pets should always have fresh water while the reservoir is full.

Petkit says the Eversweet Ultra may launch in April, though that's only an estimate. It, too, will be available on both Amazon, as well as Petkit's official site.

Yumshare Daily Feast

Perhaps Petkit's biggest announcement is the Yumshare Daily Feast, an automatic feeder that works with wet food. This feeder can portion out fresh food for cats and dogs on a custom schedule, while tracking your pet's eating habits to build "eating insights." Those insights are partially informed by the feeder's built-in 1080p AI-powered camera, with a 140-degree wide-angle view. It also supports facial recognition for multiple pets.

Petkit says the Yumshare Daily Feast supports up to seven days of fully automated wet meals—the keyword being "automated." The machine opens each food pack itself, and each portion is "kept within its freshness window." That means if your pet doesn't finish its meal, the device will take away the leftovers and use ultraviolet light (specifically UVC) to sanitize before delivering the next portion.

Petkit estimates it will release the Yumshare Daily Feast in April, both on Amazon as well as Petkit's official website.

As with any new products, we'll have to wait until reviewers can perform hands-on testing before deciding whether these devices can actually achieve what they claim to. However, if Petkit's products are up to snuff, they could offer pet owners some legitimate benefits. The wet feeder really could be great for anyone who needs to leave their cat for long periods of time. The Purobot and Eversweet could also provide some essential health insights ahead of a vet visit: If you're seeing reports that your cat is crying while using the litter box, or isn't drinking as much water as they usually do, you could get ahead of a health problem by notifying your vet early.


This Dashcam Is My Tech 'Upgrade of the Year'

By: Jake Peterson

We may earn a commission from links on this page.

To be alive in 2025 is to assume you are being recorded at all times. There are cameras just about everywhere we go these days: Security cameras, video doorbells, and smartphones are frequently recording, and so ubiquitous that I tend to figure that if I go out in public, I'm being watched.

But when you're driving, it can be a different story. Sure, there are cameras on the road—either street cameras or cameras built directly into modern cars—but there are plenty of moments behind the wheel where nothing you do is recorded. You could view this as a nice reprieve from the surveillance of modern life, but it also means that if you get into an accident, it could quickly turn into a "he said, she said" incident. Someone could reverse directly into your car, but because there were no cameras around to document the accident, your insurance company could throw their hands in the air and make you both pay for something that wasn't your fault.

I've never been in that situation, but I was tired of worrying about it. My car is on the older side, and doesn't come with any of the modern safety features newer cars now include—including cameras. Where something like a Tesla records everything in and around the car, my vehicle records nothing. If someone accidentally hit my car, or worse, intentionally tried to scam me, I could be out of luck, and without clear evidence I wasn't at fault.

You won't regret buying a dashcam, even if you never use it

After following one too many threads online sharing such horror stories, I decided it was time to get myself a dashcam. But before I could, I was gifted one last Christmas—a Redtiger 4K dashcam. It was easy enough to set up, though even after a full year of use, I'm not taking full advantage of it yet: The main unit attaches to my front windshield, with a cable that plugs into the cigarette lighter port, but this particular model also comes with a rear camera that requires a little extra maneuvering to install. I've been a bit lazy on that front, but I should get cracking, since it would be helpful to have a camera protecting the back of my car, too.

I almost wish I had some harrowing tale to tell that shows off how the dashcam saved me during the past year, or even a story about capturing some wild driving habits from my fellow drivers. But, spoiler alert: I haven't actually had to put it to use, as I've fortunately not been in an accident, or even experienced an interesting close call. But the peace of mind that comes with knowing I'd have footage of an otherwise ambiguous fender bender has been refreshing. It's nice to know when I'm driving on a particularly busy road that I have a little extra protection should someone jump their lane or decide to text and drive.

There are so many different dashcam models that I won't try to make the case that mine specifically is the one everyone should buy. But there are a few things I like about it: First, the main unit records in 4K, which means your videos will be clear enough to use in the event you need to prove yourself innocent. While 4K takes up more storage space than 1080p, this model is designed to record over itself when it fills up. Since you probably don't need all your driving footage, you don't really need to worry about running out of space, and can grab the file when something actually happens. There are other features that I don't use, like wifi connectivity and a smartphone app, but I prefer to just grab the footage off the included SD card—at least, I would if I ever needed it.


Gemini 3 Flash Outperforms Gemini 3 Pro and GPT 5.2 In These Key Benchmarks

By: Jake Peterson

The AI wars continue to heat up. Just weeks after OpenAI declared a "code red" in its race against Google, the latter released its latest lightweight model: Gemini 3 Flash. This particular Flash is the latest in Google's Gemini 3 family, which started with Gemini 3 Pro and Gemini 3 Deep Think. But while this latest model is meant to be a lighter, less expensive variant of the existing Gemini 3 models, Gemini 3 Flash is actually quite powerful in its own right. In fact, it beats out both Gemini 3 Pro and OpenAI's GPT-5.2 in some benchmarks.

Lightweight models are typically meant for more basic queries, for lower-budget requests, or to be run on lower-powered hardware. That means they're often faster than more powerful models, which take longer to process but can do more. According to Google, Gemini 3 Flash combines the best of both worlds, producing a model with Gemini 3's "Pro-grade reasoning" and "Flash-level latency, efficiency, and cost." While that likely matters most to developers, general users should also notice the improvements, as Gemini 3 Flash is now the default for both Gemini (the chatbot) and AI Mode, Google's AI-powered search.

Gemini 3 Flash performance

You can see these improvements in Google's reported benchmarking stats for Gemini 3 Flash. In Humanity's Last Exam, an academic reasoning benchmark that tests LLMs on 2,500 questions across over 100 subjects, Gemini 3 Flash scored 33.7% with no tools, and 43.5% with search and code execution. Compare that to Gemini 3 Pro's 37.5% and 45.8% scores, respectively, or OpenAI's GPT-5.2 scores of 34.5% and 45.5%. In MMMU-Pro, a benchmark that tests a model's multimodal understanding and reasoning, Gemini 3 Flash got the top score (81.2%), compared to Gemini 3 Pro (81%) and GPT-5.2 (79.5%). In fact, across the 21 benchmarking tests Google highlights in its announcement, Gemini 3 Flash has the top score in three: MMMU-Pro (tied with Gemini 3 Pro), Toolathlon, and MMMLU. Gemini 3 Pro still takes the number one spot on the most tests here (14), and GPT-5.2 topped eight tests, but Gemini 3 Flash is holding its own.

Google notes that Gemini 3 Flash also outperforms both Gemini 3 Pro and the entire 2.5 series in the SWE-bench Verified benchmark, which tests a model's coding agent capabilities. Gemini 3 Flash scored 78%, while Gemini 3 Pro scored 76.2%, Gemini 2.5 Flash scored 60.4%, and Gemini 2.5 Pro scored 59.6%. (Note that GPT-5.2 scored the best of the models Google mentions in this announcement.) It's a close race, especially when you consider this is a lightweight model scoring alongside these companies' flagship models.

Gemini 3 Flash cost

That might present an interesting dilemma for developers who pay to use AI models in their programs. Gemini 3 Flash costs $0.50 per million input tokens (what you ask the model to do), and $3.00 per million output tokens (the result the model returns from your prompt). Compare that to Gemini 3 Pro, which costs $2.00 per million input tokens and $12.00 per million output tokens, or GPT-5.2's $3.00 and $15.00 costs, respectively. For what it's worth, it's not as cheap as Gemini 2.5 Flash ($0.30 and $2.50), or Grok 4.1 Fast for that matter ($0.20 and $0.50), but it does outperform these models in Google's reported benchmarks. Google notes that Gemini 3 Flash uses 30% fewer tokens on average than 2.5 Pro, which will save on cost, while also being three times faster.
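
To see how those per-token prices play out, here's the cost of one sample workload at the rates quoted above. The workload size (10 million input tokens, 2 million output tokens) is an arbitrary assumption for illustration.

```python
# Comparing per-model costs of a sample workload at the prices quoted above.

PRICES = {  # (USD per 1M input tokens, USD per 1M output tokens)
    "Gemini 3 Flash": (0.50, 3.00),
    "Gemini 3 Pro": (2.00, 12.00),
    "GPT-5.2": (3.00, 15.00),
    "Gemini 2.5 Flash": (0.30, 2.50),
    "Grok 4.1 Fast": (0.20, 0.50),
}

input_m, output_m = 10, 2  # millions of tokens; arbitrary sample workload

for model, (in_price, out_price) in PRICES.items():
    cost = input_m * in_price + output_m * out_price
    print(f"{model}: ${cost:.2f}")
# Gemini 3 Flash comes to $11.00, vs. $44.00 for Gemini 3 Pro and $60.00 for GPT-5.2
```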

If you're someone who needs LLMs like Gemini 3 Flash to power your products, but you don't want to pay the higher costs associated with more powerful models, I could imagine this latest lightweight model looking appealing from a financial perspective.

How the average user will experience Gemini 3 Flash

Most of us using AI aren't doing so as developers who need to worry about API pricing. The majority of Gemini users are likely experiencing the model through Google's consumer products, like Search, Workspace, and the Gemini app.

Starting today, Gemini 3 Flash is the default model in the Gemini app. Google says it can handle many tasks "in just a few seconds." That might include asking Gemini for tips on improving your golf swing based on a video of yourself, or uploading a speech on a given historical topic and requesting any facts you might have missed. You could also ask the bot to code you a functioning app from a series of thoughts.

You'll also experience Gemini 3 Flash in Google Search's AI Mode. Google says the new model is better at "parsing the nuances of your question," and thinks through each part of your request. AI Mode tries to return a more complete search result by scanning hundreds of sites at once, and putting together a summary with sources for your answer. We'll have to see if Gemini 3 Flash improves on previous iterations of AI Mode.

I'm someone who still doesn't find much use for generative AI products in my day-to-day life, and I'm not entirely sure Gemini 3 Flash is going to change that. However, the balance of performance gains against the cost to process that power is interesting, and I'm particularly intrigued to see how OpenAI responds.

Gemini 3 Flash is available to all users starting today. In addition to general users in Gemini and AI Mode, developers will find it in the Gemini API in Google AI Studio, Gemini CLI, and Google Antigravity, the company's new agentic development platform. Enterprise users can use it in Vertex AI and Gemini Enterprise.


Apple Just Changed the Way You AirDrop With Strangers

By: Jake Peterson

AirDrop is one of Apple's best features. I use it on a daily basis to share files between my various Apple devices, but it really shines when I'm sharing stuff with other people, or vice versa. It can be tricky to find a quick solution for sending larger files: Emails have too low a file size limit, chat apps can compress files, and cloud storage can fill up fast, but AirDrop is simple, built-in, and reliable. It even works with Android now, albeit only with the Pixel 10.

If AirDrop has one flaw, it's that it's not particularly easy to use with strangers. Apple has changed how this side of AirDrop works over the years. For the longest time, you had two AirDrop settings: "Contacts Only," which only lets your saved contacts find your device for AirDropping files, and "Everyone," which leaves your AirDrop open to anyone with an iPhone to send you stuff. This was convenient when you needed to share files with strangers, but inconvenient if you left it on: Anyone with an iPhone could see your iPhone and send you anything—like, say, a bomb threat while on an airplane. Not good.

Then, Apple changed this latter functionality to "Everyone for 10 Minutes." Ever since, if you want to open up your AirDrop to people outside your contacts, you have to manually enable this toggle, which will only stay open for, well, 10 minutes. After that, it switches back to "Contacts Only." That's an improvement in security, but not in convenience. If you're ever in a situation where you need to AirDrop something to someone relatively frequently but you don't want to add their contact to your iPhone, you'll be switching back to "Everyone for 10 Minutes" every 10 minutes.

iOS 26.2, Apple's newest iPhone update at the time of this writing, introduces a solution—AirDrop codes. This feature forces anyone not saved in your contacts who wants to share something with you via AirDrop to ask for a one-time code first. Once you share that code, that user is temporarily saved on your iPhone for 30 days, allowing you to AirDrop repeatedly without issue. After those 30 days are up, the user leaves your iPhone, and you don't need to worry about pruning your Contacts app down the line. (This same functionality also applies to AirDrop on iPadOS 26.2 and macOS 26.2.)

How to AirDrop with strangers using AirDrop codes

Here's how this new AirDrop experience works with strangers going forward. Let's say you're at a conference and you meet someone who wants to send you some relevant materials via AirDrop. You set your AirDrop settings to "Everyone for 10 Minutes," they see your contact, and attempt to send you the file.

On your end, you see the request, with a "Continue" option: Once you tap it, you'll see the AirDrop code on your iPhone, iPad, or Mac. You can tell the code to the sender, who can enter it on their device. If successful, the file will be shared like any other AirDrop interaction.

As stated above, this allows you to AirDrop with this contact for 30 days without needing to bother with another AirDrop code. But if you're done sharing with the stranger for good, you can remove their temporary contact early. Head to the Contacts app, hit the back button in the top left (if applicable) to head to Lists, then choose Other Known. Here, you'll see any temporary contacts generated from previous AirDrop sessions, which you can delete ahead of that 30-day deadline. Otherwise, your device will take care of it once that timeframe has elapsed.


Apple's Latest iOS Update Includes a New Way to Receive Notifications

By: Jake Peterson

There are fewer and fewer hardware differences between iPhones and Androids as the years go on, but back in the day, that was far from the case. At one point, many major Android devices came with dedicated LEDs that would shine whenever you received a notification. It was a passive way to know whether you had something on your phone to attend to, without having to actually wake up the display and risk getting unnecessarily sucked into your device.

iPhones have never had this specific feature, but Apple included a workaround for anyone interested in a similar experience. For years, you've been able to dive into Accessibility settings to turn your iPhone's LED flash into a notification light. Any time you received a text, app notification, or call, your camera flash would go off, ensuring you didn't miss an important update. This can be helpful both for those who are hard of hearing and can't rely on audible alerts, and for anyone who keeps their phone on silent but would like a visual cue that they have a new notification.

For the first time in years, Apple is updating its flash alerts feature. With iOS 26.2, which the company released on Friday, you now have the option to have your iPhone's display itself flash for new alerts. You can choose to make the display the only light that flashes, or to use the feature in tandem with the LED flash, which I think makes the most sense for people who like this option. That way, it won't matter whether your iPhone is face up or face down: You'll always see a light flash for new alerts one way or another.

Display flash doesn't work like you might expect, especially if you've used LED flashes before. I thought my iPhone would flash a bright light on and off again a few times, mimicking how LED flash alerts work. Instead, when you get a new notification, the screen instantly increases its brightness for a few seconds before lowering it again. It works—you're bound to notice your display jump in brightness if it isn't already maxed out—but it doesn't quite grab your attention as well as the LED flash.

How to set up Flash for Alerts on iPhone

To start, open the Settings app on your iPhone, then head to Accessibility. Scroll down to Hearing, then choose Audio & Visual. Scroll to the bottom of this page, then tap Flash for Alerts.

If you're running an older version of iOS, you'll only have the option to enable "LED Flash." However, those running iOS 26.2 and newer will also see an option for "Screen." Choose that option if you want the display to flash for new alerts, or "Both" to have both lights enabled.

You'll also find two choices that affect when these flash alerts go off, no matter which of the above options you pick. First, you can choose whether your iPhone will use flash alerts while locked. If you disable this option, you'll only see these light alerts when your iPhone is unlocked. Second, you can choose whether to use flash alerts in Silent Mode. I'd keep that setting enabled, since it seems most useful when your iPhone has no other way to alert you to new notifications.
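If the interplay of those toggles is hard to picture, here's a rough Swift sketch of the decision logic as I understand it; the setting names are my own guesses, not Apple's code.

    // A hypothetical reconstruction of the two toggles described above;
    // the names are guesses, not Apple's actual implementation.
    enum FlashSource { case led, screen, both }

    struct FlashAlertSettings {
        var source: FlashSource = .both
        var flashWhileLocked = true    // first toggle described above
        var flashInSilentMode = true   // second toggle described above
    }

    func shouldFlash(_ settings: FlashAlertSettings,
                     isLocked: Bool,
                     isInSilentMode: Bool) -> Bool {
        if isLocked && !settings.flashWhileLocked { return false }
        if isInSilentMode && !settings.flashInSilentMode { return false }
        return true  // flash via LED, screen, or both, per settings.source
    }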

It's also important to note that using an Apple Watch can complicate this feature a bit, at least in my experience. While giving this option a test, I had trouble getting alerts to come through on my locked phone without first going to my watch. If you have an Apple Watch, and its notifications mirror your iPhone's, you'll get the most out of this feature when your iPhone is unlocked.


These Two New iPhone Features Are Coming to iOS 26.3

By: Jake Peterson

Just three days after Apple released iOS 26.2 to iPhones everywhere, the company is back at it with a new update. iOS 26.3 is official, though only for beta testers. Those brave enough to install Apple's unfinished software on their devices won't find an update packed to the brim with new features and changes, but they will stumble upon two key new features. The thing is, we already knew both of them were on the way.

This isn't the end-all, be-all for the update, however: Since iOS 26.3 is so new, it's possible testers will discover additional features hidden within the update. In addition, Apple may add new changes in subsequent beta versions. I'll continue to update this article to reflect any new features that reveal themselves, but, until then, here are the two we know about.

Notification forwarding

Back in September, we learned that Apple was quietly working on some type of notification forwarding feature, but other than that basic functionality, the details were left to speculation. At the time, the common assumption was that Apple intended the feature to be used to forward notifications to third-party devices, specifically smartwatches, in an attempt to open up the platform to wearables other than the Apple Watch. This wouldn't be Apple's choice, of course—left to its own devices, the company would keep as many features locked to Apple devices as possible. Instead, the motivation would come from the EU, which has compelled Apple to make its platforms more cooperative with third-party devices.

After three months, we are now getting our first official look at this feature. In this first iOS 26.3 beta, there is now a "Notification Forwarding" option in Notification settings. While the option isn't live at this time, Apple does have a description for how the feature works, saying that notifications can be forwarded to one device at a time. Importantly, the description says that when notifications are forwarded to another device, they will not appear on your Apple Watch. Is that limitation really necessary, Apple?
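Taking that description at face value, the constraint might be modeled something like this hypothetical Swift sketch (not Apple's actual implementation):

    // A hypothetical model of the constraints Apple describes; not the
    // actual implementation.
    struct NotificationRouting {
        var forwardingTarget: String?  // at most one forwarded device at a time

        // Per Apple's description, forwarding suppresses Apple Watch delivery.
        var deliversToAppleWatch: Bool { forwardingTarget == nil }
    }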


Transfer to Android

Knowledge of iOS 26.3's second feature is not quite so old. In fact, we only learned about it last week. As it happens, Apple is working directly with Google on an official way to make transferring between an iPhone and an Android device more seamless.

As of last week, Google had already rolled out its first test of the feature to Android Canary, but it was nowhere to be found in Apple's betas. Now, we know what to expect: In iOS' "Transfer or Reset iPhone" settings, there is now a new "Transfer to Android" option. Here, iOS instructs you to place your iPhone near your Android device, where you can choose to pass along data like photos, messages, notes, and apps. However, it seems not all data will transfer: Health data, devices paired with Bluetooth, and "protected items" like locked notes will not come along with this transfer feature.
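Based on that description, the split between what transfers and what stays behind looks roughly like this illustrative Swift sketch; the category names are mine, not Apple's API:

    // An illustrative sketch of the transfer rules described above; the
    // category names are hypothetical, not Apple's actual API.
    enum DataCategory: CaseIterable {
        case photos, messages, notes, apps
        case healthData, bluetoothPairings, protectedItems
    }

    // Health data, Bluetooth pairings, and protected items (like locked
    // notes) reportedly stay behind.
    let excluded: Set<DataCategory> = [.healthData, .bluetoothPairings, .protectedItems]
    let transferable = DataCategory.allCases.filter { !excluded.contains($0) }
    // transferable == [.photos, .messages, .notes, .apps]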


Beware of running betas on your iPhone

This isn't the flashiest beta Apple has ever shipped, but it is available to install right now. Both the developer and public betas are out, which means anyone interested can enroll their device in Apple's beta program to give iOS 26.3 a try.

However, know the risks before you do. Unfinished software can come with bugs and glitches that impact your experience using your iPhone. If the software is particularly glitchy, you could lose data when downgrading back to iOS 26.2. If you decide to install the beta, make a complete backup of your iPhone to a Mac or PC first.


I Tested Google’s New Live Translation With AirPods, and It Actually Works Well

By: Jake Peterson

I can be pretty tough on AI, especially when it's used to make misinformation slop. But as cynical as I may seem, I do acknowledge that there are plenty of useful and beneficial features that AI powers. Take live translation, for instance: Not long ago, the concept of a device that could translate someone else's words directly in your ear as they spoke would have seemed like far-future technology. But not only does that technology exist today, both Google and Apple have their own takes on the feature that users can take advantage of.

That said, not all iPhone and Android users have been able to use live translation. Both companies have limited the feature to work with their respective earbuds: For Apple, that's the AirPods Pro 2 and AirPods Pro 3; for Google, that's the Pixel Buds. Without your platform's flagship earbuds, you haven't been able to use live translation, and instead have had to stick with the rest of your translation app's experience, whether that be Apple Translate or Google Translate. Luckily for Android users, that's no longer the case for Google.

On Friday, Google announced new Gemini translation capabilities for its proprietary translation app. The company says these new updates introduce "state-of-the-art text translation quality," with more nuanced, natural, and accurate translations. Importantly, however, as part of those upgrades, the company is launching a beta where all Google Translate users can access live translation through any headphones—not just Pixel Buds. This initial rollout is only available on the Android version of Google Translate in the U.S., Mexico, and India, though Google says the company will bring the feature to iOS and more regions in the next year.

This is kind of huge: Companies typically like to keep features like this locked behind their own hardware as a marketing tactic. You're more likely to buy Pixel Buds over other earbuds or headphones if you really want to try live translation. Now, though, you don't need to buy a new pair of headphones at all: As long as you have some type of headphones or earbuds connected to your Android device, you can translate conversations on the fly.

Trying Google's live translate with Apple headphones

I gave this a shot on my Pixel 8 Pro with my AirPods Max, by playing a video of people speaking Portuguese. Setup wasn't the simplest: First, it took forever for the Pixel to recognize my AirPods, despite the headphones being in pairing mode for some time, but that's beside the point. The key issue was getting Google Translate to present the new beta for live translation. When I first opened the app, it was using the older live translate feature, which didn't work with my AirPods. I had the latest version running, so I uninstalled and reinstalled the app. When it launched, I didn't have live translate at all. Finally, after force quitting and reopening the app, I got a pop-up for the new live translation beta experience.

The next part was user error: I had my language set to the target language (Portuguese), and vice versa. As such, Google assumed I would be the one speaking Portuguese, and didn't vocalize the English translation. Once I flipped the languages, and confirmed that English would be spoken through my headphones, the feature started working—and well, for that matter. The video I chose was taken from a news broadcast, with two anchors, and various speakers during news segments. Once the video started, I could see Google Translate translating the words on my screen, and, after about four seconds, I heard the audio translated in my ear. Google Translate even tries to match the speaker's voice, and though it certainly isn't a deepfake, it does well enough to distinguish different speakers from one another. It even tried to take on a more serious tone to match the anchor's, versus the more casual tone of one of the people interviewed in a news segment.

I tried a couple of other videos in different languages, but this time, using the "Detect language" feature rather than a preset source language. The app was able to recognize this video was spoken in Thai, and this one was spoken in Urdu, and translated both accordingly. And while I can't verify the quality of the translation (I am sadly not fluent in any other language), the experience was overall easy to follow. The translated speech can get a bit slow at times, perhaps because the AI has a lot to process at once, but as long as you turn up the volume on your headphones, you won't miss much.
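Conceptually, the feature seems to run a three-step pipeline: transcribe and detect the source language, translate, then synthesize speech in your language. Here's a hedged Swift sketch of that flow; the protocols and names are hypothetical stand-ins, not Google's actual API.

    import Foundation

    // A conceptual sketch of a live-translation pipeline like the one
    // described above; not Google's actual API.
    protocol SpeechRecognizer {
        func transcribe(_ audio: Data) -> (text: String, language: String)
    }
    protocol Translator {
        func translate(_ text: String, from: String, to: String) -> String
    }
    protocol SpeechSynthesizer {
        func speak(_ text: String, matchingVoiceOf audio: Data)
    }

    func liveTranslate(chunk: Data,
                       targetLanguage: String,  // what you hear, e.g. "en"
                       recognizer: SpeechRecognizer,
                       translator: Translator,
                       synthesizer: SpeechSynthesizer) {
        // 1. Transcribe and detect the source language ("pt", "th", "ur", ...).
        let (text, detected) = recognizer.transcribe(chunk)
        // 2. Translate into the listener's language. Getting this direction
        //    right was the setup mistake described above.
        let translated = translator.translate(text, from: detected, to: targetLanguage)
        // 3. Speak the translation, loosely matching the original voice.
        synthesizer.speak(translated, matchingVoiceOf: chunk)
    }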

All that to say, I'm very interested to give this a try in a real-world scenario. Even though my daily driver is an iPhone, I might need to start carrying around my Pixel 8 Pro just in case.


The Kindle App Now Has Built-In AI, Because of Course It Does

By: Jake Peterson

It's 2025, so every piece of technology now needs to have an AI component. It doesn't matter whether these AI features are useful (though some are); they just need to be there, however ham-fisted or useless they may seem—though the line between those extremes often comes down to user preference. To that end, if you've ever been reading a book on the Kindle app and wished that you could ask your device a question about the text, Amazon has an AI bot for you.

Last week, Amazon announced "Ask this Book," a new AI feature for the Kindle app. Now available on the iOS version of the app, it lets you ask Amazon's AI questions about whatever it is you're reading, whether you bought or borrowed the title. You can highlight a selection from the text to include in your queries, and ask questions relating to the story's plot, characters, relationships, and themes. According to Amazon, all answers will be contextual, presumably meaning they'll all be related to the text at hand, and, importantly, all answers will be spoiler-free. That should help avoid the classic mistake of googling a question about a book you're reading and spoiling an upcoming plot twist or character death.

Amazon says Ask this Book is currently active for "thousands" of books written in English. As noted, as of this writing the feature is only live in the iOS version of the app, but Amazon is working on bringing it to the Android app, as well as Kindle devices, next year.

Ask this Book, whether you like it or not

If this sounds like the type of feature you'd be interested in, great! If you don't care for this feature, whether as a reader who doesn't want AI getting in the way of their books, a publisher who doesn't want Amazon training its AI on their IP, or a teacher who might see this as a potential cheating opportunity, there's bad news: Once Amazon makes Ask this Book available for any given title, it's permanently available, and there's nothing anyone can do about it. That comes directly from an Amazon spokesperson, who told Publishers Lunch, “[t]o ensure a consistent reading experience, the feature is always on, and there is no option for authors or publishers to opt titles out.”

That response bothers me for two reasons. One, it's always frustrating when a company introduces a new feature without giving users the option to turn it off. I don't use Apple Intelligence, but I appreciate that Apple lets me turn it off. Meta, on the other hand, forces me to contend with Meta AI, even though I never use it. Amazon seems to be attending the Meta school of user design.

But what's more, it seems wild to me that authors and publishers don't get a say in whether this AI bot gets to be active on their books—especially retroactively. It would even be one thing if authors had to opt in to put their books on the Kindle platform going forward. But to enable it on "thousands" of titles made available before Ask this Book was ever a thing is, to me, disrespectful to authors and publishers, to say the least.

Interestingly, Amazon dodged questions from Publishers Lunch concerning licensing rights around Ask this Book, as well as protections for users, which is troubling given generative AI has a habit of hallucinating—or, in other words, making things up completely. Sure, when it's working as intended, the AI can help readers understand things they're confused about, but there's a real chance that the AI will misinterpret questions, misrepresent the text, or straight up lie, which could negatively impact a reader's experience of the work, with potential fallout for both the author and the publisher.

How Kindle's Ask this Book works

While you won't see this feature on your Kindle yet, you will encounter it in the Kindle app. You can either access it from the menu in any book where the feature is available, or by highlighting text in said book. Once you do, Ask this Book will present a list of questions it thinks you might be interested in asking. If none of them do it for you, you can formulate your own questions, and ask follow-ups after the bot answers.
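Judging by that behavior, a query presumably carries the book, your current reading position (to keep answers spoiler-free), and any highlighted passage. Here's a hypothetical Swift sketch of that shape; none of these names come from Amazon:

    // A hypothetical shape for an "Ask this Book" request, inferred from
    // the behavior described above; this is not Amazon's actual API.
    struct AskThisBookQuery {
        let bookID: String
        let readerPosition: Int       // answers must not reference anything
                                      // past this point (spoiler-free)
        let highlightedText: String?  // optional selection grounding the question
        let question: String
    }

    let query = AskThisBookQuery(
        bookID: "example-book-id",    // hypothetical identifier
        readerPosition: 142,
        highlightedText: "the letter on the mantel",
        question: "Why does this letter matter to the narrator?"
    )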


If Some Photos Are Inexplicably Turning Red on Your iPhone, There's a Fix

By: Jake Peterson

If you open a picture in the Photos app on your iPhone, and it inexplicably starts turning red, I wouldn't blame you for being a bit concerned. After all, that's not supposed to happen, and out of all the colors your photos could randomly fade into, red is among the creepiest.

While you contemplate what angry and vengeful god you might have crossed recently, understand that this isn't necessarily a problem affecting all, or even most, iPhone users and their photos. In fact, it doesn't appear to be affecting photos taken on iPhones at all. Rather, the users reporting this issue see it when zooming in on photos taken on Android devices. It seems a new hue has been added to the iPhone/Android divide: green bubbles, red pictures.

If this isn't happening on your own iPhone, you can see the issue play out in this Reddit post. User djenki0119 posted a screen recording of themself browsing photos on their iPhone that they had originally taken on a Samsung Galaxy S24. At first, the pictures appear totally normal. But once djenki0119 zooms in on one, it quickly turns a deep shade of red—almost as if you were looking at film developing in a darkroom. This user has the same issue, only they took their photo on a Motorola Razr.

At this time, it's unclear what is actually causing the issue to occur. It usually doesn't matter what type of device took any particular image: Once it's in the Photos app, it should display normally. But there must be something about Android files that the iOS Photos app isn't reading correctly, at least when users zoom in on the image. As 9to5Mac highlights, it appears that something is adding a red filter to these images in the Photos app. Since this issue is only popping up recently, my guess is there's a bug within iOS 26, though there could be an issue with Android instead.

For what it's worth, I wasn't able to replicate the problem with photos I sent from my Pixel 8 Pro over to my iPhone. But perhaps there is some strange combination of hardware and software that results in this tinting: Maybe a photo taken on a certain type of Android device running a specific version of Android turns red on a certain iPhone model running a specific version of iOS.

How to undo a photo that turned red on iPhone

Luckily, wherever the actual issue is coming from, you don't have to wait for Apple, Google, Samsung, or Motorola to issue a fix. To return your image to its proper color scheme, open it in the Photos app, tap "Edit," then choose "Revert." This restores the image to its original state, and removes the red filter that was unnecessarily overlaid on top of it.
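"Revert" works because edits in the Photos app are non-destructive: the original image data is always kept, and edits sit on top of it as adjustments. Here's a loose Swift sketch of that idea, not Apple's actual implementation:

    import Foundation

    // A loose sketch of non-destructive editing, the mechanism that makes
    // "Revert" possible; not Apple's actual Photos implementation.
    struct EditablePhoto {
        let original: Data              // the untouched image data is kept
        var adjustments: [String] = []  // e.g. ["crop", "red-filter"]

        // Reverting simply discards the adjustment stack.
        mutating func revert() {
            adjustments.removeAll()
        }
    }

    var photo = EditablePhoto(original: Data(), adjustments: ["red-filter"])
    photo.revert()  // back to the original; the red tint is gone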
