Tag: technews

  • In 2024, using social media felt worse than ever

    Cheyenne MacDonald

    It’s never been more exhausting to be online than in 2024. While it’s been clear for some time that monetization has shifted social media into a different beast, this year in particular felt like a tipping point. Faced with the endless streams of content that’s formulated to trap viewers’ gazes, shoppable ads at every turn, AI and the unrelenting opinions of strangers, it struck me recently that despite my habitual use of these apps, I’m not actually having fun on any of them anymore.

    Take Instagram. I open the app and I’m greeted by an ad for bidets. I start scrolling. Between each of the first three posts at the top of my feed is a different ad: lingerie, squat-friendly jorts, shoes from a brand selling items that appear to be dropshipped from AliExpress at a markup. Then, thankfully, two memes back to back. I fire off the funny one to five of my friends in a way that feels obligatory. After that, another ad, then a bunch of seemingly off-target Reels from accounts I don’t even follow. Minutes pass before I encounter a post by someone I know in real life. Oh yeah, it’s time to turn off suggested posts again, something I have to do every 30 days or my feed will be filled with random crap.

    But before I get a chance to do that, I’m distracted by a Reel of a cat watching The Grinch. Then by a Reel of a guy with a tiny chihuahua in his coat pocket. Curiosity gets the better of me and I open the comments, where people are angrily writing that the dog must be suffocating. Oh no. I scroll to the next Reel, a video I’ve seen several times before of a rooster marching around in a pair of pants. Below, everyone’s fighting about whether it’s cruel to put pants on a chicken. Is it? Next, a video of a girl doing her makeup, where men are commenting that this should be considered catfishing. Deep sigh. I realize 30 minutes have somehow passed and I close Instagram, now in a worse mood than when I opened it. I’ll compulsively return in an hour or so, rinse and repeat.

    It’s not just an Instagram problem. On TikTok (which may or may not get shut down in the US very soon), the For You page has me figured out pretty well contentwise and the presence of toxic commenters is minimal, but every other post is either sponsored or hawking a product from the TikTok Shop. And it’s too easy to get sucked into the perpetual scroll. I often avoid opening the app at all just because I know I’ll end up getting trapped there for longer than I want to, watching videos about nothing made by people I don’t know and never will. But it still happens more frequently than I’d like to admit.

    These days, it feels like every gathering place on the internet is so crowded with content that’s competing for — and successfully grabbing — our attention or trying to sell us something that there’s barely any room for the “social” element of social media. Instead, we’re pushed into separate corners to stare at the glowing boxes in our hands alone.

    Fittingly, Oxford announced at the end of November that its Word of the Year for 2024 is “brain rot,” a term that expresses the supposed consequence of countless hours spent on the internet consuming stupid stuff. Just as fitting, Australia’s Macquarie Dictionary chose “enshittification,” which describes how the platforms and products we love get ruined over time as the companies behind them chase profits. (It was also the American Dialect Society’s 2023 Word of the Year.) Social media platforms were in theory designed around ideas of friendship and connection, but what’s playing out on them today couldn’t feel further from genuine human interaction.

    Facebook — if you even have an account still — might be where you’d go if you really wanted to see updates from family and other people you know IRL, but its UI has become so cluttered with recommended Reels and products that it feels unusable. Twitter, where it was once fun to keep up with live discourse around major events or fandom happenings, no longer exists, and X, its new form under Elon Musk, is filled with bots and political propaganda.

    On the other hand, Threads, an offshoot of Instagram and Meta’s answer to Twitter/X, took off this year and quickly became a hotspot for copy-paste engagement bait, a problem so bad that Instagram head Adam Mosseri has publicly acknowledged it. The Threads team has apparently been “working to get it under control,” but I still can’t scroll through my For You feed without seeing a dozen posts that are either just regurgitated memes being passed off as original thoughts, or questions to the masses that are crafted with the intention of stirring the pot. The same feed is otherwise dominated by viral videos that are ripped off from other creators without credit and pop culture commentary that almost always devolves into sexism and genderism. I often step away from Threads feeling the need to go scream in a field.

    Threads doesn’t have DMs, meaning all conversations take place in public. It finally gave users the ability to create custom feeds around searchable topics in November, but those topic pages are generally still riddled with bait-style posts, just more subject-specific versions. So far, that has made it pretty hard to find communities to authentically connect with. It all feels so impersonal.

    It doesn’t help that Threads’ Following feed currently isn’t the default view and there’s no way to change that (though Threads recently began testing the option). And at the end of the day, its 275 million or so monthly active users don’t include all that many people I actually know, especially outside of the media industry. The same goes for fediverse social networks like Mastodon and Bluesky, which are far less populated but have a cliquier feel. Visiting those platforms feels like walking into a room full of people who all know each other really well, and realizing you’re the odd one out. But at least neither Bluesky nor Mastodon is a poorly veiled shopping experience. (Threads isn’t at the moment, either, but ads are reportedly coming.)

    Maybe it all comes down to burnout in the era of excessive consumption, but lately I’ve found myself wishing for a place on the internet that feels both inviting and human. I’m sure I’m not alone. In recent years, we’ve seen alternative social apps like BeReal, Hive and the Myspace-reminiscent entrants SpaceHey and noplace pop up, all aiming to bring character and interpersonal connection back into social media. But none have quite cracked the code for lasting mainstream adoption. Discord and even Reddit to some extent address the same person-to-person need, yet they have more in common with proto social media chatrooms and forums than with the sites that sprang up during the social heyday.

    Meanwhile, Meta is increasingly pushing AI across its apps. Just this summer we got AI Studio, a chatbot maker that Meta touted not only as a way for users to create AI characters, but also as a way for “creators to build an AI as an extension of themselves to reach more fans.” Rather than talk to your real friends or make new ones around a common interest, you can deepen your parasocial relationship with celebrities, influencers and fictional characters by chatting with the AI versions of them. Or, pick from several AI girlfriends you can now find in the menu of your DMs. We’ve completely lost the plot, I fear.

    I’ve started dipping back into Tumblr here and there, if only to see a less chaotic, more curated feed and relish the reminder of how fun customization can be. A few friends have mentioned that they’ve been doing the same. But given the platform’s past policy upheavals and its current AI partnerships, it’s not exactly an online oasis either. As if on cue, I was recently served a mock Tumblr poster during my evening scroll that felt uncannily apt: “we didn’t get better. the rest of the internet just got worse.”

    This article originally appeared on Engadget at https://www.engadget.com/social-media/in-2024-using-social-media-felt-worse-than-ever-170047895.html?src=rss

  • OpenAI’s for-profit plan includes a public benefit corporation

    Igor Bonifacic

    Following months of speculation, OpenAI has finally shared how it plans to become a for-profit company. In a blog post penned by its board of directors, OpenAI said Thursday it plans to transform its for-profit arm into a Public Benefit Corporation sometime in 2025. PBCs or B Corps are for-profit organizations that attempt to balance the interests of their stakeholders while making a positive impact on society.

    “As we enter 2025, we will have to become more than a lab and a startup — we have to become an enduring company,” OpenAI said, adding that many of its competitors are registered as PBCs, including Anthropic and even Elon Musk’s own xAI. “[The move] would enable us to raise the necessary capital with conventional terms like others in this space.”

    As part of the transformation, OpenAI’s nonprofit division would retain a stake in the for-profit unit in the form of shares “at a fair valuation determined by independent financial advisors,” but would lose direct oversight of the company. “Our plan would result in one of the best resourced non-profits in history,” claims OpenAI.

    Following the reorganization, the for-profit division would be responsible for overseeing OpenAI’s “operations and business,” while the nonprofit arm would operate separately with its own leadership team and a focus on charitable efforts in health care, education and science.

    OpenAI did not state whether CEO Sam Altman would receive an equity stake as part of the restructuring. Last year, OpenAI’s board of directors briefly fired Altman before bringing him back, in the process sparking the institutional crisis that led to this week’s announcement. According to some estimates, OpenAI’s for-profit arm could be worth as much as $150 billion. In 2019, OpenAI estimated it would need to raise at least $10 billion to build artificial general intelligence. In October, the company secured $6 billion in new funding.

    “The hundreds of billions of dollars that major companies are now investing into AI development show what it will really take for OpenAI to continue pursuing the mission,” OpenAI said. “We once again need to raise more capital than we’d imagined. Investors want to back us but, at this scale of capital, need conventional equity and less structural bespokeness.”

    Despite this week’s announcement, OpenAI is likely to face multiple roadblocks in implementing its plan. In addition to its ongoing legal feud with Elon Musk, Meta recently sent a letter to California’s attorney general urging him to stop OpenAI from converting to a for-profit company, saying the move would be “wrong” and “could lead to a proliferation of similar start-up ventures that are notionally charitable until they are potentially profitable.”

    This article originally appeared on Engadget at https://www.engadget.com/ai/openais-for-profit-plan-includes-a-public-benefit-corporation-163634265.html?src=rss

  • A four-pack of Apple AirTags is on sale for a record low of $70

    Lawrence Bonk

    If you’re constantly losing your stuff, or know someone who is, now’s a great time to invest in a few AirTags. A four-pack of Apple’s Bluetooth trackers is on sale for $70 at the moment, which is a record-low price and a bit cheaper than it was during the Black Friday shopping period. You’re getting a $30 discount on the pack, and it breaks down to only $17.50 per tracker.

    Apple AirTags easily made our list of the best Bluetooth trackers, and this is especially true if you’re already tied into the company’s ecosystem. The Find My network is vast and comprehensive, which really helps when it comes time to actually find one of these tags. Just think of all of those AirTags, iPhones and other devices out there in the world helping to create this network.

    These trackers can also tap into the ultra-wideband (UWB) wireless protocol, which creates a sort of game out of finding a lost item in the home. As long as the object is within 25 feet of your smartphone, the screen will display directional arrows and a distance meter. This lets you zero in on the object without having to constantly ring the AirTag.

    Now onto the caveats. AirTags really only work with iPhones and other Apple devices, so Android users should keep shopping for something else. Also, the ringer only pings for seven seconds at a time, which can make finding something feel like a mad dash. Finally, there’s no attachment point for connecting to a keychain or a related accessory. Luckily, there are all kinds of amazing AirTag accessories to get that job done. One recent case even comes with batteries that will power the tag for a full decade.

    Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.

    This article originally appeared on Engadget at https://www.engadget.com/deals/a-four-pack-of-apple-airtags-is-on-sale-for-a-record-low-of-70-161406138.html?src=rss

  • How to use Visual Intelligence, Apple’s take on Google Lens

    Lawrence Bonk

    The recent rollout of iOS 18.2 finally brings many of the promised Apple Intelligence features, like Genmoji and Image Playground. One such long-awaited tool is Visual Intelligence, a feature currently reserved for the iPhone 16 Pro and Pro Max that was first introduced at the company’s September event.

    Visual Intelligence is Apple’s answer to Google Lens. It leverages the camera system and AI to analyze images in real time and provide useful information. This can help people learn more about the world around them and is particularly handy for shopping, looking up details about a restaurant or business, translating written text, summarizing text or having something read aloud. It can also integrate with Google Image Search and ChatGPT.

    There are two caveats. The Apple Intelligence rollout has been something of a convoluted mess, and this trend continues with Visual Intelligence. For now, the feature only works with the iPhone 16 Pro and Pro Max, which are the beefiest of the company’s recent handsets. Apple has indicated that the feature could eventually become available for older models. Google Lens, after all, has been around since 2017, which was when the Pixel 2 was the hottest handset on the block.

    There’s also a waitlist, which is true of all Apple Intelligence features. To join the list, head to Settings and look for “Apple Intelligence & Siri.” Then tap “Join Waitlist.” Once approved, the software will be ready to use.

    As of this writing, the only way to launch Visual Intelligence is to long-press the Camera Control button. That’s the new control interface on the bottom right side of the handset. Once pressed, the Visual Intelligence interface will open up. 

    Now the fun begins. Just point your phone at something and select ChatGPT, via the bottom left icon, or Google Image Search, via the bottom right icon. Alternatively, if the visual field includes text, tap the circle at the bottom of the screen. The phone can also be pointed at a business to obtain useful information. 

    To work with written text, hover the phone in front of it, activate Visual Intelligence and tap the circle at the bottom of the screen. This will analyze the text. Once analyzed, there are a few options. Tap “Translate” at the bottom of the screen to translate the text into another language. Tap “Read Aloud” if you want the text to be read aloud by Siri. Tap “Summarize” for a quick summary of the copy.

    The tool will also identify contact information in the text, like phone numbers, email addresses and websites. Users can take action depending on the type of text. For instance, tap on the phone number to give it a ring. Other actions include starting an email, creating a calendar event or heading to a website. Tap the “More” button to see all of the available options. Tap “Close” or swipe up to end the session.
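    Visual Intelligence itself isn’t something developers can call directly, but the “point the camera at text and pull it out” step it performs is roughly what Apple’s Vision framework exposes as on-device text recognition. The Swift sketch below is only an illustration of that general technique on a still image, an assumption about comparable tooling rather than Apple’s actual implementation.

        import UIKit
        import Vision

        // Rough sketch: recognize printed text in a still image, entirely on-device.
        // This is not the Visual Intelligence API (which isn't public); it only
        // illustrates the same kind of text extraction via the Vision framework.
        func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
            guard let cgImage = image.cgImage else {
                completion([])
                return
            }

            let request = VNRecognizeTextRequest { request, _ in
                let observations = request.results as? [VNRecognizedTextObservation] ?? []
                // Keep the best candidate string for each detected run of text.
                let lines = observations.compactMap { $0.topCandidates(1).first?.string }
                completion(lines)
            }
            request.recognitionLevel = .accurate
            request.usesLanguageCorrection = true

            // Vision work can be slow, so run the request off the main thread.
            DispatchQueue.global(qos: .userInitiated).async {
                let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
                try? handler.perform([request])
            }
        }

    Run on a photo of a menu, for example, a routine like this hands back each detected line of text, which is roughly the raw material that options like Translate, Summarize and Read Aloud would then work from.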

    Visual Intelligence can provide details about a business that’s directly in front of you. Just open up the tool and point the camera at the signage. The name of the business should appear at the top of the screen. Tap “Schedule” to see the hours of operation or tap “Order” to buy something. View the menu or available services by tapping “Menu” and make a reservation by touching “Reservation.” To call the business, read reviews or view the website, tap “More.”

    Swipe up or tap “Close” to end the session. This feature is currently only available to US customers.

    To ask ChatGPT about something, start by pointing the camera at an object. Activate Visual Intelligence and tap the ChatGPT icon on the bottom left side of the screen. Tap the “Ask” button for information about the object. We used it on a bottle of hand cream, which it properly identified. After that, a text field will appear for follow-up questions. Users can ask whatever they want, but results may vary. We asked ChatGPT where to buy the hand cream and how much it costs. It performed admirably at this task. Yay shopping.

    Tap the “Close” button or swipe up to remove all fields, which will also shut down Visual Intelligence. 

    Choosing Google Image Search will bring up a Safari dialog box that contains similar photos pulled from the web. A good use case here is finding deals. We took a photo of a bottle of hand cream and the Safari results had plenty of different price points to choose from. However, users have to find the best deal and complete a purchase on their own.

    Tap the “Close” button to eliminate these results and then swipe up from the bottom of the screen to shut down the tool.

    This article originally appeared on Engadget at https://www.engadget.com/mobile/smartphones/how-to-use-visual-intelligence-apples-take-on-google-lens-150039141.html?src=rss
