As Apple prepares to extend its apps platform into the intimate world of Spatial Computing with Apple Vision Pro and at the same time into what it’s calling a more socially-connected FaceTime experience, the company is facing two apparent competitive threats that also represent two opposite extremes of influence.
The first threat relates to Artificial Intelligence and the second to Apple’s App Store. They might not seem related, but as media narratives they seem to be.
Both come from the same kind of thinkers who not so long ago imagined Apple would be driven out of business because it wasn’t investing all of its resources into Voice First smart microphones, and then into folding displays that promised to turn a thick iPhone into a creased iPad. Why isn’t Apple chasing all the mistakes of others?
At the same time, these same thinkers were also insisting that world governments should all force the company to manufacture replaceable battery packs like Nokia did back before phones were waterproof, to license Adobe Flash on its mobiles, to adopt mini-USB, and to provide free tech support for any sort of counterfeit components users might install inside their devices. Shouldn’t the bureaucrats who can barely balance their own budgets take over Apple’s engineering?
The first developer beta of macOS Sonoma 14.4 is out for testing following the release of version 14.3.
Developers taking part in the beta test program can pick up the newest build through the Apple Developer Center or by updating Macs already running the beta. Public beta versions are generally made available via the Apple Beta Software Program not long after the developer versions are released.
The new beta round follows the public release of macOS Sonoma 14.3 on January 22, which in turn followed the release candidate on January 17.
With the release of watchOS 10.3, Apple has moved on to the next generation, providing its first developer beta of watchOS 10.4.
Developers participating in the beta program can pull the latest version via the Apple Developer Center or simply by updating their devices with the beta software. Public beta releases are usually available a brief period after the developer releases, via the Apple Beta Software Program.
The first watchOS 10.4 developer beta arrives after the public release of watchOS 10.3 on January 22, which followed the release candidate on January 17.
Apple’s first-quarter results should provide ‘modest upside’ for investors, according to a note from Evercore analysts, with continued high Services growth and risk in China at the forefront of investors’ minds.
Apple’s holiday quarter earnings will be announced on February 1, detailing the iPhone maker’s finances during its typically busy period. According to a pre-results note to investors seen by AppleInsider, the results should mark the start of an improved outlook for Apple throughout the rest of the year.
The note forecasts that Apple’s revenue will reach $117 billion, in the same ballpark as the year-ago quarter and slightly below the consensus estimate of $118 billion. For earnings per share, Evercore puts the figure at $2.08, up year-on-year from $1.88 and just below the $2.10 consensus.
Apple has attacked what it calls the UK’s “unprecedented overreach” in proposing that it have the power of veto over all Big Tech security features across the globe.
The UK’s House of Lords is due to debate an update to the country’s Investigatory Powers Act (IPA) 2016 on January 30, 2024. In a much earlier form in 2015, the IPA was slammed by Apple for how it then proposed breaking encryption.
According to BBC News, Apple is now attacking the latest update proposals. Apple objects to the UK having a veto over security updates, and to the provision that, if the country were to exercise that veto, no Big Tech firm could even say that it had.
Sony’s PlayStation division has cooked up its first State of Play event for 2024, which will stream this Wednesday at 5PM ET. The company promises a runtime of 40 minutes and coverage of more than 15 upcoming titles.
To that end, Sony says two of the games profiled will be Stellar Blade and Rise of the Ronin. Stellar Blade, formerly called Project Eve, has been on our radar for a while, and it’s been around 18 months since we got an update. The PS5-exclusive was supposed to hit store shelves in 2023, so we’re due for a release date and another trailer that shows off more footage of the forthcoming action RPG. For those keeping score, the first teaser trailer for Stellar Blade appeared way back in 2019.
As for Rise of the Ronin, it’s a historical action RPG from Team Ninja, the developer behind Nioh. The game’s set in 1863, during Japan’s Bakumatsu era, and you play as a wandering Ronin. Expect plenty of third-person melee combat and gorgeous visuals. You’ll be able to get your hands on it on March 22, so expect some sort of final trailer.
Those are the only two confirmed games that’ll get the spotlight during this week’s stream, leaving more than 13 unknowns. There have been rumors swirling around the internet throughout the weekend regarding what else will be on the docket. These leaks suggest the stream will also feature Death Stranding 2, Final Fantasy 7 Rebirth, a remaster of Sonic Generations, a remake of Silent Hill 2 and a new Metro game, among others. Like all leaks, take this information with a grain of salt. However, the original leaker did nail the date of the event, so there’s that.
You can watch via the official PlayStation site. It’ll also be available on the company’s YouTube, TikTok and Twitch channels. Sony promises information on both PS5 exclusives and upcoming PS VR 2 games.
This article originally appeared on Engadget at https://www.engadget.com/the-first-playstation-state-of-play-of-2024-will-stream-this-wednesday-at-5pm-et-192534173.html?src=rss
For nearly a decade the Galaxy Note was the undisputed king of Android phones. But when the OG phablet line was retired in 2020, that title passed on to the Ultra. While the hardware inside the most expensive Galaxy S model is as dominant as ever, over the past few years, the software in Google phones has begun to outshine anything available from Samsung. But armed with a new suite of AI-powered features, the Galaxy S24 Ultra (S24U) got exactly what it needed to maintain its spot atop the Android battlefield.
Design and display: Now with titanium
There are three main areas of improvement to the S24 Ultra: design, cameras and all of Samsung’s new AI tools. The biggest change to its build is the switch to a new titanium frame, which follows what Apple did for the iPhone 15 Pro last fall. So no points for originality. But more importantly, because the previous Ultra featured an aluminum chassis, there’s not a major change in weight either, with the S24U coming in at 232 grams (just two grams lighter than the S23 Ultra).
Some other subtle changes are a new matte finish and an upgrade to Corning’s Gorilla Armor on the front and back (instead of Gorilla Glass Victus 2 like on the regular S24/S24+). Another benefit of Corning’s latest hardened glass is its improved anti-reflective properties: while it doesn’t totally eliminate glare, it does make it appear less harsh without impacting the display’s color saturation. And despite the previous model already having slim bezels, Samsung reduced the borders around the display by another 42 percent, which is most noticeable along the top and bottom.
The display itself proves, once again, that Samsung makes the best mobile screens on the market. You still get a 6.8-inch OLED panel with a variable 120Hz refresh rate, except now it’s even brighter with a peak of 2,600 nits (up from 1,750 nits). And if that’s not enough, the phone’s improved Vision Booster adds an additional 300 nits of perceived brightness, so movies, games and everything else look good no matter where you are.
Performance: Setting a new bar for speed
Inside, the S24 Ultra features a new Snapdragon 8 Gen 3 SoC from Qualcomm, and it’s a powerhouse. In Geekbench 6, we saw multi-core scores that were 30 to 35 percent higher than last year’s chip. This makes everything from games to switching between apps feel super snappy. The addition of a 92 percent larger vapor chamber also meant the S24 Ultra never got above lukewarm, even under sustained loads. Samsung also increased the phone’s RAM to 12GB for every config, unlike the S23 Ultra, which started at 8GB. Storage remains the same with 256GB, 512GB and 1TB options.
Cameras: A more usable 5x optical zoom
Three of the S24 Ultra’s four cameras are largely unchanged from its predecessor, including its 200-MP main sensor, 12-MP ultra-wide and 10-MP telephoto shooter with a 3x optical zoom. The main upgrade is swapping out the old 10x lens for a 5x optical zoom with a higher-res 50-MP sensor, which Samsung says reflects 5x being the most widely used focal length aside from the main cam. While this move might seem like a loss in terms of reach, the sensor’s increased resolution allows the phone to crop in, providing what Samsung calls a “10x optical quality zoom” that’s surprisingly sharp.
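The arithmetic behind that claim is straightforward: cropping the 50-MP sensor from its native 5x framing to a 10x-equivalent framing halves the field of view in each dimension, leaving roughly a quarter of the pixels. Here’s a minimal sketch of that math in Python (illustrative numbers only, not Samsung’s actual imaging pipeline, which adds multi-frame processing and sharpening):

# Rough illustration of why a 50-MP 5x sensor can deliver a usable "10x" crop.
# Illustrative only; the real pipeline does far more than simple cropping.
def cropped_megapixels(native_mp, native_zoom, target_zoom):
    factor = target_zoom / native_zoom   # linear crop factor
    return native_mp / (factor ** 2)     # pixel count drops with the square of the crop

print(cropped_megapixels(50, 5, 10))  # ~12.5 MP left at a 10x-equivalent framing
print(cropped_megapixels(50, 5, 20))  # ~3.1 MP at 20x, which is why longer zooms get soft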
In photos of the World Trade Center and the Statue of Liberty from across the water, the S24U produced rich, detailed pics that were just as good as what we got from a Pixel 8 Pro. And while images taken at 10x were a touch softer than similar shots taken by an S23 Ultra, they weren’t far off.
In general the S24U captured gorgeous pics in all sorts of conditions. You’ll still notice Samsung’s super-saturated colors and penchant for slightly warmer hues, but in most cases that just adds an extra sense of vibrancy. Samsung also has a habit of going a bit overboard on sharpening, though it’s not a major distraction. Even in low light the S24U’s Night Mode largely kept up with Google’s Night Sight, which is no small feat.
Software: Samsung’s big push into AI
Aside from its new hardware, the biggest addition to the S24 Ultra is Samsung’s Galaxy AI features, which are an entire suite of tools that fall into three main categories: text and translation, photography and editing, and search.
There’s an interpreter mode for in-person conversations along with a live translation feature that you can use during calls. Both are good enough to use in a pinch while traveling, but some things like word choice and pacing may be a bit off. The experience can also feel a bit clunky, especially when you’re on the phone and have to wait for the AI to catch up.
Next, you have Chat Assist, which can check spelling and grammar and adjust the tone of messages. Admittedly, the social and emojify options are a bit gimmicky, but I genuinely appreciate the polite and professional choices, as they can help prevent a text or email from sounding combative.
In the Notes app, the S24U can also summarize, auto-format, spellcheck or translate a file, which is nice, but not exactly groundbreaking. A lot of these features are already available from other services like ChatGPT or Bard. That said, these improvements may be the biggest upgrades to the S24 Ultra’s S-Pen, which is otherwise largely unchanged.
Out of Samsung’s text-based tools, my favorite is the transcription feature in the Voice Recorder app. It makes grabbing quotes from interviews super simple, though I noticed that Samsung’s UX doesn’t feel quite as polished or streamlined as what you get from Google. For example, the Pixel Recorder lets you see the transcript in real-time, while on the Ultra, you have to record a convo and then hit the AI icon to generate a chat log when you’re done.
The AI can also suggest edits for images, like automatically remastering them (which is similar to the Auto Tone feature in Photoshop) or removing distracting elements like shadows and reflections. You can see these options by hitting the Info icon in the gallery app, which makes them super easy to access and might be the fastest way to improve your photos. The S24 Ultra can also create slow-mo clips from existing footage, just by tapping and holding on a video while it’s playing. This triggers the phone’s AI to generate new frames based on the fps of the recording (e.g., from 30 fps to 120 fps) on the fly, and the results are surprisingly smooth.
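For a sense of what frame generation involves: going from 30 fps to 120 fps means synthesizing three new frames between every pair of captured frames. Here’s a deliberately naive sketch that fills the gaps with simple linear blends; Samsung’s feature uses AI-based frame generation rather than cross-fading, so treat this only as an illustration of the frame-count math:

import numpy as np

def interpolate_frames(frame_a, frame_b, factor=4):
    # 30 fps -> 120 fps is a 4x factor, so synthesize 3 frames per original pair.
    # Plain linear blending; real interpolators estimate motion instead of cross-fading.
    return [(1 - i / factor) * frame_a + (i / factor) * frame_b
            for i in range(1, factor)]

# Two dummy 720p frames: one black, one white.
a = np.zeros((720, 1280, 3), dtype=np.float32)
b = np.full((720, 1280, 3), 255, dtype=np.float32)
print(len(interpolate_frames(a, b)))  # 3 in-between frames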
If you prefer a more hands-on approach, there are Generative AI edits that allow you to reframe shots, move subjects around or delete them entirely, while the phone fills in the blanks. It’s a simple but effective process that sidesteps the need for Photoshop in a lot of cases. That said, if you look closely you may notice areas where Samsung’s AI misses more details than the Pixel 8’s Magic Editor, which is a trend I noticed across a lot of Samsung’s AI features.
All of the new tools generally function as expected, but things don’t feel quite as streamlined or polished as a lot of Google’s alternatives. In the Notes app, there’s a word limit for auto-formatting, summarizing and more, which limits you to about three or four paragraphs at a time. That means if you have a medium-sized doc, you’re gonna have to tackle it in chunks, which gets tedious pretty quickly. And sometimes if you try to highlight areas of a photo to remove reflections, the phone will smooth over the entire area and paint over the details.
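On the word-limit point, the practical workaround is to feed longer notes in pieces. A minimal sketch of that kind of chunking in Python (the three-to-four-paragraph cutoff here is an assumption from my testing, not a documented limit):

def chunk_paragraphs(text, max_paragraphs=4):
    # Split a longer document into groups of a few paragraphs so each group
    # stays under the Notes app's processing limit (limit assumed, not documented).
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    return ["\n\n".join(paragraphs[i:i + max_paragraphs])
            for i in range(0, len(paragraphs), max_paragraphs)]

doc = "First paragraph.\n\nSecond.\n\nThird.\n\nFourth.\n\nFifth."
for chunk in chunk_paragraphs(doc):
    print(chunk)
    print("---")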
In other situations, the AI will suggest edits that don’t make sense, like trying to turn a short motion photo into a timelapse. It’s possible this was my fault for importing a photo taken by another device, but I feel like the phone ought to know better. The AI is meant to work on any photo, regardless of where it came from. Even moving subjects around in a pic can get wonky depending on the shot and what you’re trying to do. And every now and then, the phone will suggest you remaster a photo, only for it to tell you that there’s nothing to fix. As a photographer, that’s a great feeling. But at the same time, why am I being told there are things to fix if that’s not actually the case? But, this is Samsung’s first big push into AI-assisted features, so it shouldn’t be a shock to see a handful of hiccups.
Rounding out the S24’s kit is Circle to Search, which is the one new AI feature that relies on help from the cloud instead of taking place on-device. It’s essentially a combination of traditional text-based queries and visual search tools like Google Lens but without the need for a standalone app. The neat thing is that it can analyze images from the web or objects in photos you’ve taken yourself, which makes it pretty versatile. But Google recently announced that Circle to Search is coming to Pixel phones too, so it’s not like this is an exclusive feather in Samsung’s cap.
Battery Life: Nearing two days of juice
Between the power efficiency gains from its new processor and a large 5,000 mAh battery, the S24 Ultra delivered truly impressive longevity. On our local video rundown test, it lasted 24 hours and 19 minutes, which is up more than four hours compared to last year. And in the real world, its battery life was even more impressive. The S24U often had more than 50 percent left after 24 hours. So depending on your usage, it’s possible for this phone to last two days without recharging.
Wrap-up
At this point, you’d be forgiven for being fed up with companies trying to push AI into everything. But if you just think about these as software upgrades meant to make your phone more useful, Samsung’s push into machine learning makes a lot more sense. The S23U was already a great phone and on the S24 Ultra, we’re getting the same (though somewhat plain) design, but with a tougher titanium frame, a much faster chip, a brighter display and even longer battery life. Samsung also tweaked its main telephoto lens to provide a more useful focal length but without a major decrease in reach or quality.
But the big thing is that, with its Galaxy AI suite, Samsung finally has an answer to the sophisticated features that were previously only available from the Pixel family. Sure, the S24’s tools aren’t quite as polished as Google’s offerings, but they get you 80 to 90 percent of the way there. And as a complement to what is more or less a top-to-bottom list of best-in-class smartphone hardware, it feels like Samsung is using AI to shore up one of the few remaining weaknesses of its flagship handset. Particularly now that the company is following in Google’s footsteps and increasing software support to seven years of OS and security updates.
However, the Ultra’s biggest sticking point — its price — remains an issue. With the S24U starting at $1,300, it costs $100 more than the outgoing model. I’m also disappointed that Samsung didn’t adopt Qi 2. It’s frustrating to see all the major OEMs, including Apple, agree on a wireless charging standard only to have the biggest phone maker in the world drag its feet. Qi 2 got approved last year and we may not see it on a high-end Samsung handset until 2025.
While harnessing AI might not be a super exciting development now that everyone and their grandmother is trying to shoehorn it into everything, it does make the S24 Ultra a more powerful and well-rounded handset. And when you tack that onto a phone that already had a lead in hardware, you end up with a pretty commanding device.
This article originally appeared on Engadget at https://www.engadget.com/galaxy-s24-ultra-review-samsungs-ai-reinforcements-have-arrived-specs-price-191508062.html?src=rss
The Apple Vision Pro will be missing some major native apps at the outset, including Netflix, Spotify and YouTube. One notable app to which users of the mixed-reality headset will have access when it debuts later this week is Zoom, which will support the Vision Pro’s Persona feature.
Vision Pro users will be able to create digital versions of themselves. If you have said Persona, others on a Zoom or FaceTime call will be able to see your facial expressions and hand movements via your avatar. So while you may not be using a traditional webcam, other folks might notice your Persona cringing at one of your boss’ bad jokes.
According to Zoom, the app’s spatial experience can be “scaled to the perfect size,” so it shouldn’t seem like you’re miles away from someone’s Persona. Although Vision Pro users will be represented as a Persona (if they choose to be), those joining the call from other devices will be represented as a floating tile.
Zoom will be one of the first major third-party apps to use this tech. Apple said Microsoft Teams and Cisco Webex are getting in on the party too. The company claims that it only takes a few minutes to set up a Persona with a Vision Pro.
There are more features coming to Zoom’s app this spring. You’ll be able to share 3D object files and view these in a virtual space through Vision Pro. Team Chat is also coming to the app, as is a tool called real-world pinning. Zoom says you’ll be able to use this to pin five meeting participants anywhere in the virtual space and have the option of removing their background. The company suggests this will help Vision Pro users “feel more connected to the people in the meeting.”
While Zoom might not be the most exciting app for those who are picking up a Vision Pro primarily for entertainment purposes, it’s interesting to see what third-party companies are starting to do with the tech. A Zoom call might not be too much different from a FaceTime chat out of the gate, but the addition of features like 3D object sharing could make it a more intriguing prospect for mixed-reality use.
This article originally appeared on Engadget at https://www.engadget.com/zooms-apple-vision-pro-app-will-let-people-see-your-facial-expressions-via-an-avatar-184536273.html?src=rss