Apple has officially unveiled the iPhone 16e, a budget-friendly model replacing the aging iPhone SE. However, what sets this model apart is the introduction of the C1 cellular modem, Apple’s first in-house modem chip designed for iPhones.
While specific details are scarce, Apple emphasizes that the C1 chip is the most power-efficient modem ever used in an iPhone, ensuring fast, reliable 5G connectivity. This focus on power efficiency is noteworthy, especially for a budget device like the iPhone 16e. By consuming less power, Apple could improve the phone’s battery life without needing larger batteries. As 5G and AI-powered features continue to dominate smartphones, this could lead to better overall performance and longer usage times.
The C1 modem could significantly enhance the battery efficiency of the iPhone 16e, potentially extending its life while running demanding applications like AI and 5G. While iPhones have solid battery life, they’ve rarely been at the top in endurance. The C1 chip could change that by reducing power consumption, making the iPhone 16e more efficient than its predecessors.
In addition to the C1 modem, Apple’s use of the A18 chip in the iPhone 16e has led to a redesigned internal architecture. Apple hasn’t provided full details, but this new component packaging could optimize space and improve thermal management, potentially leading to a thinner, more efficient phone.
There’s speculation that this internal redesign could also set the stage for future hardware advancements. One possibility is a dedicated GPU separate from the A18 chip, which could elevate the iPhone’s performance in gaming and visually demanding tasks, further improving the user experience.
By designing its own components, Apple can fine-tune performance and efficiency more effectively. While this is just the beginning, it could lead to more customized solutions for future iPhones, distinguishing Apple devices from the competition.
Meta has unveiled a significant update to Facebook Live that will affect how long videos remain available on the platform. Starting February 19, 2025, all Facebook Live videos will only be accessible for 30 days before they are automatically deleted.
This change is part of Meta's broader effort to reduce storage costs and optimize the management of video data. The company explained that most users watch Facebook Live videos within the first few weeks after they are broadcast, which is why the decision was made to limit the storage duration of these videos.
Previously, Facebook Live videos were kept on the platform indefinitely, but Meta noticed that older live streams were seldom viewed. As part of its new strategy, the company will begin removing all Facebook Live videos older than 30 days in stages over the coming months.
Meta's decision is largely driven by the high costs associated with storing videos, especially as the company plans to invest $65 billion in AI development this year. By clearing out older content, Meta can free up storage space and reduce operational expenses.
For users with older live streams, Meta will send notifications and provide a 90-day window to download or save the videos before they are permanently deleted. Users will also have the option to extend the deletion period by an additional six months, should they need more time to act.
To encourage users to keep their content on the platform, Meta is suggesting that creators and businesses turn their old Facebook Live videos into Reels. This will allow videos to remain on the platform longer while taking advantage of Reels' newer format and increased engagement potential.
For creators and businesses that rely on Facebook Live for content, this update serves as a reminder to save any important videos before they are deleted. If you have significant or archived livestreams, now is the time to review your content and either download or repurpose it into another format, such as Reels, to ensure it remains accessible.
Nvidia has made a significant shift in its support for legacy technologies with the announcement that 32-bit implementations of PhysX will no longer be supported on its new RTX 50 series graphics cards. This decision effectively removes 32-bit CUDA application support on the latest GPUs, marking the end of an era for some older games that rely on this technology.
For gamers who enjoy titles from the 2000s and early 2010s, this change means losing some of the advanced particle and cloth effects that were originally powered by PhysX. Iconic games like Mirror's Edge, Mafia II, and Batman: Arkham City will be affected, and while there are workarounds, the move has left many wondering about the future of older game performance on modern hardware.
PhysX is Nvidia's physics engine, responsible for creating dynamic, realistic effects like cloth simulation and particle systems in games. When 32-bit CUDA applications were phased out on the RTX 50 series cards, users quickly noticed broken or degraded effects in classic games that depend on this engine.
As highlighted on Nvidia’s forums (via PCGamesN), one user encountered issues when trying to run PhysX with their RTX 5090. With GPU acceleration unavailable, PhysX fell back to running on the CPU instead of the GPU, significantly impacting performance. Nvidia staff responded, confirming that the issue stemmed from the deprecation of 32-bit CUDA support on the new RTX 50 series GPUs.
The performance loss can be seen in some older titles like Borderlands 2, where PhysX adds environmental effects such as cloth physics and dust particles. One Reddit user shared their experience, reporting a significant drop in frame rates when trying to run PhysX on their RTX 5090 FE paired with an AMD Ryzen 7 9800X3D processor. The frame rate dipped to around 60 fps, showcasing the impact of this change.
While turning off PhysX does make the game feel more static and less immersive, it does not render the game unplayable. However, some games, like Borderlands 2, currently offer no easy way to disable PhysX, forcing users to rely on slow CPU-based physics processing.
A major list of games that rely on 32-bit PhysX support was posted on ResetEra, with several popular titles on the list. These include Metro: Last Light, Assassin's Creed IV: Black Flag, and Tom Clancy’s Ghost Recon Advanced Warfighter 2. While the technology had some notable implementations, such as in Batman: Arkham City and Mirror's Edge, its performance was often criticized. The reliance on Nvidia GPUs for hardware acceleration and the significant work required to support it meant that PhysX was always a niche feature rather than a standard.
If you're one of the lucky few with an RTX 5080 or RTX 5090, there are ways to work around the lack of 32-bit PhysX support, such as disabling PhysX in a game's settings where possible, or installing an older Nvidia GPU alongside the new card to act as a dedicated PhysX processor.
The removal of support for 32-bit applications on Nvidia cards has been a long time coming. While this is one of the bigger losses for gamers who enjoy classic titles with advanced PhysX features, it’s part of a broader trend toward more modern, efficient architectures. Nvidia has been gradually phasing out legacy support as it moves towards more cutting-edge technologies.
Despite the transition, the community is often quick to come up with solutions. So, while official support is ending, there’s hope that third-party fixes will surface to keep older games running smoothly on new hardware.
Google has recently begun enforcing new tracking rules across connected devices like smartphones, consoles, and smart TVs, according to a report by BBC. These changes come after Google, back in 2019, criticized fingerprint tracking as “wrong,” but now the tech giant has reversed course and begun permitting this controversial technique.
While Google has acknowledged that fingerprinting is already in broad use across the industry, it officially implemented the changes on February 16, 2025. Despite the company’s defense, fingerprinting—which collects data about a device's hardware and software—can uniquely identify individual devices or users, sparking significant privacy concerns.
Privacy advocates have swiftly criticized Google’s move, calling the new tracking rules “a blatant disregard for user privacy.” Mozilla’s Martin Thomson emphasized the risks, warning that Google’s actions grant both the company and the advertising industry it dominates the ability to track users in ways that are nearly impossible for individuals to prevent.
The new rules, while framed as privacy-enhancing technologies by Google, have raised alarms among privacy campaigners. Google claims these tools allow advertisers to succeed on emerging platforms without compromising user privacy, but many question how these technologies might undermine personal security.
Fingerprinting refers to the technique of collecting detailed information about a user's browser and device to create a unique profile. This method gathers a wide range of data, such as screen size, language settings, battery level, time zone, and browser type, which helps advertisers target specific ads to users.
However, unlike traditional tracking methods, which depend on cookies that users can opt out of, fingerprinting is far harder to block. This leaves users with less control over how their data is gathered and used, resulting in greater concerns about privacy violations.
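Conceptually, fingerprinting boils down to hashing a bundle of passively observable device attributes into a stable identifier that survives cookie deletion. The sketch below illustrates the idea; the attribute names, values, and hashing scheme are illustrative assumptions, not Google's actual method:

```python
import hashlib
import json

# Hypothetical device attributes a tracker can read without any cookie.
# These names and values are illustrative, not from any real device.
device_attributes = {
    "screen_resolution": "1920x1080",
    "language": "en-US",
    "timezone": "Europe/Berlin",
    "platform": "Linux x86_64",
    "browser": "Chrome/121",
    "color_depth": 24,
}

def fingerprint(attrs: dict) -> str:
    """Hash a canonical JSON encoding of the attributes into a stable ID."""
    canonical = json.dumps(attrs, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

# The same attributes always yield the same ID; clearing cookies changes nothing.
print(fingerprint(device_attributes))
```

Because the identifier is derived from the device itself rather than stored on it, there is nothing for the user to delete, which is why privacy advocates consider fingerprinting so much harder to block than cookies.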
Initially, when Google first announced these new tracking rules in December 2024, there was little pushback. However, as the rules began to take effect, criticism has mounted. Google’s reliance on fingerprinting to overcome issues with ad targeting on devices like smart TVs and gaming consoles, where cookie consent mechanisms are limited or non-existent, has raised questions about the ethical implications of using such methods across a wide range of platforms.
Google's reversal on fingerprint tracking signals a shift in the company's approach to user privacy. While Google asserts that the new technology is designed to respect privacy while enhancing advertising capabilities, many remain skeptical. As the tech industry grapples with the balance between monetizing user data and safeguarding personal information, fingerprint tracking may become a focal point in the ongoing debate about privacy rights in the digital age.
After months of leaks and speculation, Apple is set to announce its fourth-generation iPhone SE later this week. The new version of Apple’s budget iPhone will feature several innovative updates, including the addition of an OLED display and Face ID—two firsts for the iPhone SE series.
One of the most significant changes in the iPhone SE 4 is the inclusion of Apple’s first-ever in-house 5G modem, which will be manufactured by TSMC (Taiwan Semiconductor Manufacturing Company). This marks a major shift in strategy, as Apple has traditionally relied on external chipmakers like Intel and Qualcomm for 5G modems.
This move is part of Apple’s broader goal to become more self-reliant in terms of hardware components, reducing its dependence on external suppliers and bringing more components in-house. However, industry reports indicate that Apple’s first-generation modem may not be as powerful or efficient as Qualcomm’s Snapdragon X75.
A report from South Korea has shed light on the performance differences between Apple's in-house modem and Qualcomm’s flagship Snapdragon X75. According to the report, Apple's first-generation modem falls short of the X75, meaning the iPhone SE 4 could offer slower data speeds compared to the iPhone 16 series, which features Qualcomm’s more advanced modem.
The iPhone SE 4 will serve as a testing ground for Apple’s self-developed modem, and it is expected to feature some important characteristics, such as Dual SIM Dual Standby and deep integration with Apple-designed processors. These features will likely help improve battery life and overall system efficiency, despite the modem’s lower performance compared to Qualcomm’s offerings.