Monday 31 October 2022

Battery-Powered Shoe Attachment Boosts Walking Speed by 250 Percent

(Photo: Shift Robotics)
What if you could walk really fast? Okay, not Sonic the Hedgehog fast…and not Quicksilver fast, either. Think more along the lines of nearly three times as fast as you already cruise through the park or through a dying shopping mall.

Shift Robotics claims its Moonwalkers can help you accomplish that. Designed by a team of Carnegie Mellon University engineers who banded together to found the company, these battery-powered attachments strap onto almost any pair of shoes to give you an enviable speed boost. Instead of free-wheeling like roller skates, Moonwalkers’ eight polyurethane wheels work with a set of built-in sensors that switch between “lock” and “shift” modes; lock mode keeps the wheels from spinning when the wearer is navigating stairs, riding public transit, or otherwise needs full motion control. These modes also help the wearer stop within one meter even at top speed, which is said to be 250 percent faster than the wearer’s normal walking speed.
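Shift Robotics hasn’t published its control logic, but the behavior described above maps onto a simple state machine. Here’s a toy sketch in Python; the sensor inputs, data structure, and decision rules are our own illustrative assumptions, not the company’s implementation.

```python
# A toy sketch of sensor-driven mode selection. Everything here
# (inputs, names, logic) is hypothetical and for illustration only.
from dataclasses import dataclass

@dataclass
class GaitSample:
    stride_detected: bool  # e.g., inferred from an IMU
    on_stairs: bool        # hypothetical classifier output

def select_mode(sample: GaitSample) -> str:
    # "lock" keeps the wheels from spinning on stairs or while standing;
    # "shift" lets the drivetrain amplify a normal walking stride.
    if sample.on_stairs or not sample.stride_detected:
        return "lock"
    return "shift"

print(select_mode(GaitSample(stride_detected=True, on_stairs=False)))  # shift
print(select_mode(GaitSample(stride_detected=False, on_stairs=True)))  # lock
```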

The attachments’ chassis are made entirely from aluminum to resist crushing and assist in thermal management. The 300-watt electric motor, which powers the wheels for up to six miles of active use per charge, is fully sealed to protect against water and debris ingress. According to Shift, this is what allows Moonwalkers to handle puddles and sidewalks that are in less-than-perfect condition. Because Moonwalkers are designed to match the wearer’s gait, there’s said to be zero learning curve, something that can’t be said for conventional equipment like roller skates and rollerblades.

It’s hard to avoid wondering if Moonwalkers are a solution to a problem that doesn’t exist. After all, a full range of motion is still required to use them; they don’t serve as a mobility aid for those who can’t already walk, and since they use wheels, Moonwalkers are rendered pointless on most unpaved surfaces. (You certainly can’t bring them on a hike or to the beach, where walking is arguably more exhausting.) Shift Robotics seems to be positioning its attachments as a way to make city life a bit more efficient: “With Moonwalkers, you can pick up your dry cleaning across town, carry those grocery bags a little easier, grab those last-minute dinner items much quicker, or whatever else with much more ease,” its Kickstarter page reads.

But at $799 to $1,299 per pair (depending on the Kickstarter campaign’s progress), the cost of that added efficiency is pretty steep. This means Moonwalkers’ target audience is quite small: Frugal budgeters, rural dwellers, and those who like to stop and smell the roses need not back this project.




AMD Rumored to Launch Two High-End RDNA3 GPUs This Week

It’s been a time of glorious bounty in the PC hardware universe. We’ve seen powerful new CPU architectures from both Intel and AMD launch within weeks of each other, and we witnessed the unabashedly potent RTX 40-series arrive as well. It’s an exciting time to be a PC gamer.

Now there’s just one more highly anticipated arrival to effectively wrap up the party: AMD’s next-gen GPUs. These will truly be something special: the first chiplet-based GPUs for consumers. The RX 7900 XT was an obvious product, but the existence of a second, more powerful GPU is the big news here. It seems AMD is following Nvidia’s strategy of launching two high-end cards first. As you may recall, Nvidia previously “unlaunched” the third 40-series GPU, the RTX 4080 12GB.

The latest rumors indicate AMD will be launching the RX 7900 XTX and the RX 7900 XT this week. Yes, that is a lot of Xs. AMD is also surprisingly bringing back the XTX moniker, which it hasn’t used since the X1900 XTX in 2006. Both GPUs will be built on Navi31, the big die of the family, and will seemingly take on the RTX 4090 at the top of the RDNA3 stack. The XTX version will be the full version of the chip, with the XT version being slightly cut down. The flagship should come with 24GB of 20Gb/s memory on a 384-bit memory bus, which aligns with previous rumors. This should give it almost a terabyte per second of memory bandwidth, and that number doesn’t include the benefits of its Infinity Cache.
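That “almost a terabyte per second” figure is easy to sanity-check: peak bandwidth is just the per-pin data rate times the bus width. Here’s the back-of-envelope math in Python, using the rumored specs (the same formula produces the cut-down XT’s number, covered below).

```python
# Peak memory bandwidth = per-pin data rate (Gb/s) * bus width (bits) / 8.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8  # divide by 8: bits -> bytes

print(bandwidth_gb_s(20, 384))  # rumored RX 7900 XTX: 960.0 GB/s
print(bandwidth_gb_s(20, 320))  # rumored RX 7900 XT: 800.0 GB/s
```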

This fan-made render of the flagship Navi31 GPU looks tantalizing. (Image: @Technetium)

It’s expected to offer up to 96MB of Infinity Cache this time, according to TechSpot. That’s a reduction from the 128MB offered on the 6900 XT. However, AMD could be using its V-Cache technology to vertically stack some of it, too. It was previously rumored the card might offer as much as 384MB, which would be truly nuts. The XTX card is reported to boast 12,288 stream processors. That’s more than double the 5,120 found in the previous flagship, the Radeon RX 6900 XT. Power consumption is still a big unknown, though AMD has previously stated it’ll be going up this generation. Since the 6900 XT was a 300W card, we can comfortably predict something in the neighborhood of 375W.

The cut-down XT version will use 20GB of GDDR6 memory across a narrower 320-bit bus. It’ll allegedly offer 800GB/s of memory bandwidth, which is more than the RTX 4080 16GB, not accounting for Infinity Cache benefits. It’s possible this card could land in between Nvidia’s top GPUs, though. It’ll offer 10,752 stream processors, more than double the 6800 XT’s 4,608.

Notably, neither GPU will be using the newfangled 12VHPWR connector. Instead, they’ll use the tried-and-true eight- and six-pin connectors we all have now. That will come as a relief to anyone following the GPU world lately: Nvidia is currently embroiled in a controversy over its 4-into-1 PCIe power adapters melting, supposedly caused by a suboptimal soldering job on the wires inside the plug that goes into the card. The hullabaloo prompted an AMD executive to confirm on Twitter that the company won’t be using the new connector.

Sadly, we must report that although AMD is announcing these GPUs on Nov. 3, it might be another month before they reach retail. Noted tipster Greymon55 recently posted that they won’t go on sale until December. They were originally supposed to be offered two weeks after the reveal, in late November. Pricing and board power are still TBD, but we shall find out soon enough.




UFO Sightings Are Usually Foreign Spying or Space Junk, Government Sources Say

(Photo: Stefan-XP/Wikimedia Commons)
Many UFO sightings recorded over the last several years can likely be attributed to far more ordinary circumstances than originally thought. Instead of extraterrestrials, most strange aerial activity can be chalked up to foreign countries’ surveillance operations and space debris, say government officials.

The last year has seen the US government begin to pay serious attention to possible alien activity. In 2021 the Office of the Director of National Intelligence conducted an unprecedented preliminary assessment of Unidentified Aerial Phenomena (UAPs), the government’s term for UFOs. The assessment aimed to determine how federal agencies could begin investigating, classifying, and addressing 144 UAP sightings from the previous 17 years. Most of these sightings weren’t explainable at the time, but this week, intelligence agencies will provide Congress with an update…and it’s a little less flashy than anticipated.

According to a handful of government officials who spoke in confidence with The New York Times last week, many UAP sightings from the last decade or so appear to be the result of foreign surveillance or space junk. Some incidents have officially been attributed to Chinese surveillance, which the country conducted with “relatively ordinary drone technology.” Others are thought to be associated with China but haven’t formally been classified as such. Based on what people close to the matter told The New York Times, China has a vested interest in learning how the US trains its military pilots. The best place to figure that out, Chinese intelligence agencies seem to believe, is high in the sky.

NASA’s BARREL balloon admittedly looks a bit odd from the ground. (Photo: NASA/Goddard/BARREL/Brett Anderson/Wikimedia Commons)

Other sightings have been attributed to space junk, or retired satellites and other equipment that remain in orbit. Even less exciting is the fact that some “flying saucers” have reportedly been weather balloons, which sometimes look a bit otherworldly from down on the ground.

But if so many UAPs are actually the result of familiar, Earth-based activity, why have they been considered UAPs for so long? “In many cases, observed phenomena are classified as ‘unidentified’ simply because sensors were not able to collect enough information to make a positive attribution,” Defense Department spokesperson Sue Gough told The New York Times. Impractical sensor optics and illusions caused by water have also made certain aerial phenomena look far more unusual (and worrisome) than they actually are. According to Gough, the Pentagon—and likely NASA, thanks to its new independent UAP study—is working to refine its analysis methods so that ordinary activity doesn’t stay stuck in UAP status for so long.




Perseverance Rover Prepares to Drop Off Mars Samples for Return Mission

The Perseverance rover is a well-equipped robot with a gaggle of cameras, multiple spectrometers, and even a little box that makes oxygen. You can’t get every possible scientific instrument on a Mars-bound rover, though. To really understand the red planet, we need to get samples back to Earth, and Perseverance is preparing to take the next step in making that happen. NASA and the ESA have agreed on a location for Perseverance to deposit the first sample cache, which could be retrieved a few years down the line in the NASA-ESA Mars Sample Return Campaign.

During its time on Mars, Perseverance will analyze numerous samples with the tools at its disposal, but the team is also carefully curating a collection of samples that will come back to Earth. The rover was designed with an innovative sample caching system, which packages up rock cores in pristine metal tubes that will protect them from contamination on the return journey. So far, Perseverance has collected 14 rock-core samples in the tubes. The robot has several dozen sample tubes at its disposal.

NASA and the ESA have agreed that the first batch of samples will be deposited at a site known as Three Forks, near the base of the ancient river delta in Jezero Crater. The mission to retrieve the samples is still evolving, so the team has to make some guesses about how the Return Campaign will work.

A few months ago, the agencies updated the plan, dropping a second rover that was supposed to fly with the return vehicle. Now, Perseverance will be the primary means of getting samples to the Mars Ascent Vehicle (MAV). The return lander will also carry a pair of helicopters based on Ingenuity’s wildly successful design. The tube cache at Three Forks will act as a backup in the event Perseverance cannot rendezvous with the MAV or an issue pops up in the sample caching system.

While Perseverance drops off its first collection of samples, engineers back on Earth have begun the process of testing hardware for the return campaign. In what is known as “Phase B,” the team is working to develop prototypes that will eventually become the final flight hardware, which will hopefully have no defects or software glitches. There’s enough that can go wrong without hardware failures. After the return lander touches down in Jezero Crater, the recovered tubes will be loaded into the MAV, which will blast them into orbit. At that point, an ESA spacecraft will have to pick them up and make its way back to Earth. If all goes as planned, the samples could be back on Earth as soon as 2033.




Friday 28 October 2022

Musk Completes Twitter Takeover, Promises It Won’t Become a ‘Hellscape’

It finally happened — after months of snide public statements and legal wrangling, Elon Musk has acquired Twitter for the original $44 billion offer. Musk wasted no time firing several executives, including CEO Parag Agrawal, almost the instant the deal was done. Musk has hinted at some major changes to the way Twitter works, including the end of lifetime bans, but he promises it won’t become a “free-for-all hellscape.” It may not be possible to thread that needle, though.

According to Musk, who has a penchant for grandiose overstatements, he purchased Twitter because he wants to help humanity. Also among those immediately fired by Musk was Twitter’s head of policy, trust, and safety, Vijaya Gadde. Musk has harshly criticized Twitter’s handling of numerous disputes, which sometimes involved temporary and lifetime bans on controversial figures. Reports now indicate that Musk plans to end the practice of permanently banning users.

Conservative users, who largely believe Twitter discriminates against them, are cheering the news. Musk has spent the past few years adopting right-wing talking points around COVID and criticizing Democrats, so they do have reason to be hopeful. Should Musk follow through on his apparent intention to reverse lifetime bans, that could mean the return of former president Donald Trump, who was permanently banned from Twitter following the January 6th Capitol riot. Trump has since launched his own social network known as Truth Social, but it’s apparently floundering. Returning to Twitter is probably what he’s wanted all along, anyway.

Musk says he plans to make some changes to the company’s suspension policy quickly, so accounts banned for harassment and misinformation could reappear soon. In addition to Trump, conservative political actors on Twitter are begging Musk to reinstate the accounts for Infowars and its founder Alex Jones, as well as Project Veritas, Milo Yiannopoulos, and election conspiracy theorist Mike Lindell.

We might soon get a preview of how, if at all, Musk will attempt to rein in misinformation on his new social network. As the deal closed, news broke of a break-in at the home shared by US Speaker of the House Nancy Pelosi and her husband Paul Pelosi. While the Speaker was in Washington at the time, reports say Paul was badly injured and is currently recovering in the hospital. Conservatives on Twitter have started alleging the attack was a false flag aimed at influencing the upcoming election, comparing Pelosi to actor Jussie Smollett, who was convicted of concocting a fake assault story several years ago. The claim is trending right alongside news of the break-in. Should this become a new election conspiracy theory, the old Twitter would have taken action, but will Musk?

Right now, it sounds like Musk wants to have it both ways. He’s committed to running Twitter as a “digital town square” that allows diverse opinions, but he also says it must be “warm and welcoming to all.” Regardless of what Musk says during this pivotal time for the company, it seems inevitable that Twitter will spend more time targeting bots than misinformation. Some people may prefer that, but it could make the site, if not a hellscape, at least a lot more combative. Musk got himself into this mess, and he’ll be blamed for whatever happens.




These Haptic Gloves for the Metaverse Require ‘Airpack,’ Cost $495 Per Month

(Photo: HaptX Inc.)
Virtual reality (VR) experiences can be impressively realistic. In fact, some are so convincing that users forget they’re seeing something fictional or end up using VR to treat certain phobias. But what if you could take this realism a step further by physically feeling the items you encounter in-game? That’s the question behind HaptX Inc.’s new haptic gloves, which use tactile responses to make VR experiences feel more immersive.

HaptX Gloves G1 comprise two kinesthetic feedback gloves and a wireless “Airpack,” the latter of which can either be worn by the user like a backpack or set on a nearby table. The Airpack generates compressed air and controls its flow, both of which are essential to delivering detailed haptic feedback. The gloves themselves, which come in four sizes, contain hundreds of microfluidic actuators that displace the skin with each pulse. This gives the user the sensation that the objects they’re touching and interacting with in the virtual space are real.

The product is meant for the “enterprise metaverse,” or VR experiences created by or dedicated to corporate entities. As HaptX suggests in its promotional video, some larger organizations have recently expressed interest in conducting training within the metaverse, while others believe it could transform the way people shop. If that interest survives the metaverse’s shaky future and turns into true commitment, companies might be able to forgo physical training materials and interactive product displays in favor of virtual ones.

Whether that’d be financially viable depends on the company. HaptX Gloves G1 cost $5,495 per pair with a $495 monthly fee. The fee is part of the unavoidable HaptX Subscription program, which includes service, maintenance, and the company’s software development kit (SDK). The HaptX SDK is said to let developers incorporate Unreal Engine and Unity plugins, use a C++ API, and take advantage of features already built into the product, like advanced vibrotactile feedback and a haptic multiplayer feature that allows multiple users to “feel” the same objects. (Sound familiar?)

HaptX argues its Gloves G1 are far more accessible than their predecessor, the HaptX Gloves DK2, which cost over $10,000 per unit, a steep price even for the enterprise VR scene. Still, despite the HaptX Gloves G1’s cost and physical bulk, it’s impressive that users can feel different textures, manipulate virtual materials, and otherwise physically experience simulations. VR cat cafe, anyone?




New Transparent Solar Cells Could Help Scientists Create Energy-Generating Windows

(Photo: Isaac Burke/Unsplash)
Soon the natural light filtering through your window could do more than just brighten up your space. Scientists have achieved a level of efficiency for dye-sensitized solar cells (DSCs) that might enable the creation of energy-generating windows.

In a paper published this week in the journal Nature, researchers from Switzerland’s École Polytechnique Fédérale de Lausanne detail how they helped DSCs harvest energy from the full visible light spectrum. DSCs, a type of low-cost, thin-film solar cell, use photosensitized dye attached to the surface of a wide-band-gap semiconductor to convert visible light into energy. Despite their financial and physical practicality, they’re not as efficient as conventional solar cells, which delegate both light absorption and energy generation to the semiconductor. This means that even though energy-generating windows have technically been possible for a while, the devices wouldn’t have been worth the resources.

This new efficiency record could change that. The team in Switzerland enhanced DSCs’ efficiency by meticulously controlling the assembly of dye molecules on the cells’ nanocrystalline mesoporous titanium dioxide (TiO2) films. Pre-adsorbing a single layer of a hydroxamic acid derivative on the film’s surface allowed the scientists to improve the molecular packing and performance of two custom-designed sensitizers, which were found to be capable of harvesting light from the entire visible spectrum.

Dye-sensitized solar cells. (Image: Ronald Vera Saavedra Colombia/Wikimedia Commons)

During a simulation of standard air mass 1.5 sunlight—the reference spectrum typically used to measure solar cells’ performance—the enhanced DSCs achieved a power conversion efficiency (PCE) of 15.2 percent. Considering that 12.3 percent was the best-known DSC PCE in 2019, that figure is impressive, especially when you factor in that the enhanced cells maintained operational stability over 500 hours of testing. Better yet, when the scientists tested enhanced DSCs with a larger active surface area under ambient indoor light, they achieved a groundbreaking PCE range of 28.4 to 30.2 percent.
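For reference, PCE is just electrical power out over light power in, conventionally computed from a cell’s short-circuit current density, open-circuit voltage, and fill factor. Here’s a minimal sketch using made-up device parameters rather than values from the paper, against the standard AM1.5 irradiance of 100mW/cm²:

```python
# PCE = (Jsc * Voc * FF) / Pin. The device numbers below are hypothetical.
def pce(jsc_ma_cm2: float, voc_v: float, fill_factor: float,
        p_in_mw_cm2: float = 100.0) -> float:
    p_out = jsc_ma_cm2 * voc_v * fill_factor  # output power, mW/cm^2
    return p_out / p_in_mw_cm2

# Hypothetical cell: 20 mA/cm^2, 1.0 V, 0.76 fill factor -> 15.2% PCE
print(f"{pce(20.0, 1.0, 0.76):.1%}")
```

This framing also shows why ambient-light efficiencies can run so much higher: indoor light delivers far less input power, shrinking the denominator.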

The team believes the enhanced DSCs could pave the way for energy-generating windows, skylights, and greenhouses in the near future. They could even find a place in low-power electronic devices, which would then use ambient light as an energy source.




Physical Copies of Call of Duty: Modern Warfare 2 Don’t Have the Game on the Disc

Call of Duty is one of the most popular video game franchises in history — as Sony has pointed out while trying to keep Microsoft from acquiring Activision Blizzard. CoD could almost be considered a genre all by itself. Naturally, the release of a new game in the series is a big deal, but as gamers excitedly tear into their physical copies of Call of Duty: Modern Warfare II, they’re finding a nasty surprise. Instead of the game, the discs hold just 72MB of data; the rest must be downloaded.

The disc version of the game appears to be functionally identical to a digital copy. Instead of buying the license and downloading the game, you purchase a disc that ships to you with a license key. Pop that in your console, and you don’t get to play right away — you have to download the game just like those who bought digital codes. The production and shipping of these discs, then, are a complete waste of time and resources.

Call of Duty: Modern Warfare II is a roughly 35GB game, which is small enough to fit on modern Blu-ray game discs. However, with the addition of a day-one patch, MWII balloons to a ridiculous 150GB on the PS5. Ignoring for a moment how silly it is to ship a disc with nothing on it, it’s unacceptable that the publishers didn’t tell people this is what they were buying. There are good reasons people might want the disc version of a game.

Playing games via physical discs isn’t as common as it used to be, but it provides an important option for people who don’t have enough internet bandwidth to download dozens of gigabytes. Many ISPs also cap data usage, so you may not want to download enormous games. In the US, which generally has higher data caps than many countries, Comcast caps residential users at 1.2TB per month. Downloading Modern Warfare II would eat up more than 10 percent of that allotment, even if you went out of your way to buy a disc version of the game. Plus, a disc is more likely to work in the absence of online services, which publishers like to shut down to save money once a game is no longer popular. It appears none of those advantages matter to Activision.
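The back-of-envelope math behind that claim, assuming the roughly 150GB post-patch PS5 install:

```python
# Share of a 1.2TB (1,200GB) monthly cap consumed by one 150GB download.
cap_gb, game_gb = 1200, 150
print(f"{game_gb / cap_gb:.1%}")  # 12.5%, i.e., "more than 10 percent"
```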




DOJ Launches Criminal Probe Examining Tesla’s Autopilot Crashes

Tesla has leaned into autonomous driving like few other automakers with its Autopilot system, which has been a core feature of its vehicles since 2013. However, that push could get the company in trouble. A new report says that the US Department of Justice has opened a criminal investigation of Tesla following a series of crashes and deaths related to Autopilot.

Initially, Tesla required customers to pay extra for Autopilot, priced between $5,000 and $8,000, depending on features. As it expanded to produce less expensive vehicles, Tesla included the basic Autopilot feature set for no additional charge. As more drivers started using Autopilot, we started to see reports of accidents where Autopilot was in complete control of the car. Tesla has since expanded the “Full Self-Driving (FSD)” features of its vehicles to make the cars more reliable and able to drive themselves in more situations. And yet, Autopilot is not true self-driving.

At issue is the way Tesla advertises and discusses Autopilot. While the company’s more careful disclaimers note that drivers have to keep their hands on the wheel, anyone who has driven in a Tesla knows that the vehicle will often let you zone out for long periods of time without any nudges. And then there’s the way Tesla CEO Elon Musk talks about Autopilot. A promotional video on Tesla’s website features Musk saying that the driver is only there “for legal reasons,” and “the car is driving itself.”

Overhead signs present a challenge for autopilot systems

Reuters reports that the DOJ investigation started last year and could be a more serious threat to Tesla than the various state-level investigations already pending. The case could result in criminal charges against individual executives or the company as a whole, sources have said. However, charges would most likely require that evidence of intentional misrepresentation be uncovered in the probe. If not, Tesla can always point to its disclaimers as legal cover, even if Musk is out there making wild claims about Autopilot’s capabilities. The National Highway Traffic Safety Administration is also investigating crashes in which Teslas were in Autopilot mode.

Tesla does not have a media office — it only has Elon Musk, who has been too busy closing his Twitter acquisition to tweet any statements about this report. This comes as Tesla has been paring back the sensors in its cars, which has made some Autopilot features unavailable as the company works to update the system to rely solely on camera input. Tesla is not alone in struggling to perfect self-driving technology. After years and billions of dollars, big players like Google and Uber are still struggling to make vehicles that can drive as well as humans.




NASA Borrowing Parts From Mars Mission to Put an Earthquake Detector on the Moon

NASA’s InSight Mars mission is winding down, and while it never managed to get the burrowing heat probe to work, InSight is still a huge success thanks to its groundbreaking seismometer. Now, the first seismometer to operate on another planet is making history again. Spare parts from the Seismic Experiment for Internal Structure (SEIS) will form the basis for a seismic instrument that will make its way to the far side of the moon in 2025.

SEIS was designed and developed by the Institut de Physique du Globe de Paris (IPGP) and the French CNES space agency. Work began back in the 90s, and eventually the project was chosen to fly on InSight, which reached Mars in 2018. As part of the development process, engineers built a duplicate seismometer that is still on Earth. Parts of this device will be integrated into the Farside Seismic Suite (FSS) that NASA plans to deploy to Schrödinger crater on the far side of the Moon.

SEIS (above) featured three ultra-sensitive pendulums spaced 120 degrees apart, allowing it to detect movement as small as 10 picometers in any direction. That’s smaller than the width of a single atom. This incredible precision allowed NASA to record hundreds of marsquakes, far more than scientists expected to detect. For the FSS, one of the backup SEIS pendulums will become the Very Broad Band (VBB) seismometer for the mission, which will measure vertical ground vibrations. A second instrument known as the Short Period Seismometer (SPS) will monitor movement in other directions.
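The geometry is what makes three single-axis sensors direction-agnostic: each pendulum measures the projection of ground motion onto its own axis, and the full motion vector falls out of a small linear inversion. Here’s a simplified 2D illustration in Python; the real pendulums are oblique, which extends the same idea to 3D, and the motion numbers are hypothetical.

```python
import numpy as np

# Three sensing axes 120 degrees apart (a simplified 2D analogue of the
# pendulum geometry; all values below are illustrative, not SEIS data).
angles = np.deg2rad([0, 120, 240])
axes = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # 3x2 matrix

true_motion = np.array([7e-12, -4e-12])  # hypothetical ground motion, meters
readings = axes @ true_motion            # what each pendulum would measure

# Least-squares inversion recovers the full motion vector from the three
# single-axis readings, regardless of the quake's direction.
recovered, *_ = np.linalg.lstsq(axes, readings, rcond=None)
print(recovered)  # ~[ 7.e-12 -4.e-12]
```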

The existing SEIS hardware was already a good match for the proposed lunar application, according to Gabriel Pont, who manages the FSS project at CNES. “The Farside Seismic Suite seismometer will be tuned for lunar gravity. It will be placed in a vacuum protection case called seismobox,” Pont told Ars Technica. The team expects the 40-kilogram FSS package to have similar sensitivity to SEIS on Mars, making it about 10 times more sensitive than the last seismometers deployed on the moon during the Apollo program.

NASA has awarded the contract for transporting the Farside Seismic Suite to Draper Laboratory. The lander (see above) is just the vehicle for getting the FSS to the surface. The instruments will be independent of the lander, with their own solar panels, communications, and heaters. To save power, the FSS will not transmit data during the lunar night, but it will connect to an orbiter while in sunlight to upload data. NASA is paying Draper $73 million for the landing under the Commercial Lunar Payload Services program; touchdown is currently set for May 2025.




RTX 4090 Cable Gate Saga Continues, Nvidia’s Adapter Possibly to Blame

(Photo: /u/NoDuelsPolicy on Reddit)
This week we’ve seen more reports of fried RTX 4090 adapters, with the count now up to a half dozen or so. Previously, we knew only that some adapters were melting and, in the process, damaging the connector on the card. It wasn’t clear exactly what was to blame: the adapter, the bending of it, or some other engineering mishap. Now the intrepid tinkerers at Igor’s Lab have revealed what they believe is the culprit: the Nvidia-designed 4-into-1 adapter, which takes four 8-pin cables and routes them into a single 16-pin plug. Its poor construction is likely the cause of the issue, and photos reveal it to be a hot mess, pardon the pun.

It was previously theorized the failures were somehow related to bending the adapter, which is required due to the size of the GPUs. Their width places the connector close to most cases’ side panels, necessitating an almost 90-degree bend to move the cable out of the way. This isn’t rocket science; you don’t want to put a serious bend on any electrical connection. That might not be an issue if the connection were rock solid, but it turns out it’s not very solid at all, seemingly due to insufficient soldering around the wires, which is meant to keep them in place. Igor’s Lab took one of the adapters apart to investigate, and the results aren’t pretty.

The exposed connector looks like a lab experiment, not something provided with a $1,600 GPU. (Image: Igor’s Lab)

According to his investigation, there are six contact patches and four wires on each side of the adapter. The two patches on the edges are soldered to one wire each, and the four in the middle are connected to two wires each. You can see in the photo that a surprisingly small amount of solder is used for these connections. Igor says the connections use a 0.2mm copper base with 2mm of solder per wire, which works out to 4mm of solder for the twin-wire connections.

For 14-gauge wire, that’s not much. Now imagine bending those wires at a right angle while they’re hot, and you can see the problem. The solo wires at the edges would likely come loose first. Unsurprisingly, that’s exactly what we’ve seen in several of the reports so far (see below). When that happens, all the current flows through the remaining wires, and they heat up drastically.
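Some rough numbers show why a lost contact or two matters so much. This assumes the connector’s full 600W rating at 12V with current shared evenly across the six 12V pins, which is an idealization rather than a measurement:

```python
# Back-of-envelope: per-pin current as solder joints fail (idealized).
total_power_w = 600.0  # 12VHPWR's rated maximum
voltage = 12.0
total_current = total_power_w / voltage  # 50 A across the connector
nominal_per_pin = total_current / 6      # ~8.3 A with all six pins sharing

for good_pins in (6, 5, 4):
    per_pin = total_current / good_pins
    # Contact heating scales with current squared (P = I^2 * R).
    relative_heat = (per_pin / nominal_per_pin) ** 2
    print(f"{good_pins} pins sharing: {per_pin:.1f} A each, "
          f"{relative_heat:.2f}x the per-contact heating")
```

Because heating grows with the square of current, losing two of six contacts more than doubles the heat dissipated at each survivor.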

A melted adapter via Reddit.

To summarize the findings, the problem is not with the actual cables coming from the PSU, nor with the connector on the PCB. It seems to lie exclusively with the adapter designed by Nvidia, which is included with every Ada GPU and was made by a third party. Igor assumes Nvidia wasn’t aware of how poorly the adapters were made, or didn’t examine them too closely. If it had, it would never have allowed its add-in board (AIB) partners to include them with its flagship GPUs.

For now, Nvidia is still investigating. In an update, Igor’s Lab says Nvidia has reached out to its board partners and asked to have all damaged GPUs sent to HQ for analysis. The next logical step would be for Nvidia to announce a recall, have new adapters made, and replace the old ones for free. That could take some time, obviously, and will seriously piss off current RTX 4090 owners, though those gamers will likely prefer a safer adapter in the long run. They could alternatively purchase an ATX 3.0/PCIe Gen 5 power supply, but those are still hard to find and will be expensive. Companies like Seasonic and CableMod have also announced 90-degree adapters, but those aren’t for sale yet.

Seasonic’s upcoming 90-degree 12VHPWR cable. (Image: Bilibili)

Until we hear from Nvidia officially, our advice is pretty simple: be careful with your adapter, and don’t bend it. If it’s currently bending due to your case’s side panel, take off the side panel.




Thursday 27 October 2022

Our Search for Habitable Planets Just Got a Lot Narrower, Study Suggests

In the past few decades, astronomy has gone from speculating about the existence of planets outside our solar system to identifying more than 5,000 of them. Now, the hunt for habitable exoplanets is on, but we may have a smaller pool of possible candidates, according to a new study from researchers at the University of California, Riverside. After analyzing a nearby exoplanet, the team has concluded it is unlikely that the most common type of star in the Milky Way is capable of supporting life as we know it.

The study focused on an exoplanet called GJ 1252b, which orbits an M-dwarf star (sometimes called a red dwarf) just 66 light years away. That’s right next door in astronomical terms. These small, long-lived stars are so numerous that discovering habitable planets around them could mean a much higher chance of finding extraterrestrial life.

However, there’s one big problem: An exoplanet close enough to a red dwarf to have liquid water would also be subjected to intense radiation and unpredictable solar flare activity. Could such a world even maintain an atmosphere? The proximity of GJ 1252b provided the Riverside researchers with a chance to find out one way or the other.

Astronomers measured infrared radiation from GJ 1252b during a so-called “secondary eclipse,” when the planet passes behind its star, which blocks both the planet’s own emissions and the starlight it reflects. The radiation readings suggested the exoplanet, which completes an orbit of its host star in just 12 Earth hours, has a surface temperature of 2,242 degrees Fahrenheit (1,228 degrees Celsius). That’s hot enough to melt gold, silver, and copper. This led the researchers to conclude that GJ 1252b does not have an atmosphere.
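That melting-point claim checks out against well-known values:

```python
# Sanity check: the reported surface temperature vs. common melting points.
surface_temp_c = 1228
melting_points_c = {"gold": 1064, "silver": 962, "copper": 1085}
for metal, mp in melting_points_c.items():
    state = "melts" if surface_temp_c > mp else "stays solid"
    print(f"{metal}: {mp} C -> {state} at {surface_temp_c} C")
```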

The Trappist-1 system consists of several rocky planets orbiting an M-dwarf star. The Webb telescope will soon conduct new observations of these worlds.

Further, the team calculated what it would take for the exoplanet to hold onto any atmosphere whatsoever in the face of such intense stellar activity. Even with carbon levels 700 times higher than Earth’s, they estimate GJ 1252b would still have its atmosphere stripped away. “It’s possible this planet’s condition could be a bad sign for planets even further away from this type of star,” says Riverside astrophysicist Michelle Hill.

This doesn’t mean habitable planets are out of the question, though. While most of the 5,000 stars in Earth’s immediate neighborhood are M-dwarfs, about 1,000 are similar to the sun and could potentially host Earth-like planets. And the universe is a big place — there are always more stars to survey, and new instruments like the James Webb Space Telescope will help us observe them in greater detail than ever before.




NASA Opens Investigation Into Recent UFO Sightings, Hopes They’re ‘Not an Adversary’

(Photo: NASA)

We could be closer to confirming the presence of extraterrestrial life than ever before. NASA has officially kicked off its study of UFOs (officially UAPs, or unidentified aerial phenomena), and 16 top scientists and scholars will be leading the effort.

In a statement released Friday, NASA announced that it had chosen the individuals responsible for conducting its independent UAP study, which is separate from the Department of Defense’s UAP Task Force and its successor, the Airborne Object Identification and Management Synchronization Group. The team is impressively diverse: two astrophysicists, two policy specialists, two aviation specialists, an oceanographer, an AI startup founder, a science journalist, a planetary scientist, a former NASA astronaut, a telescope scientist, a space infrastructure consultant, an electrical and computer engineer, and a physicist all made the cut.

Beginning this week, the team will be responsible for building a foundation upon which NASA and other agencies can continue studying UAPs. Focusing entirely on unclassified data, they’ll help determine how information sourced from civilians, government entities, and private enterprises can be analyzed and potentially used for future UAP identification. The study will also support NASA’s aircraft safety goals, given UAPs’ potential effect on air safety and national security overall.

A UFO photographed over Lake Cote, Costa Rica in September 1971. (Photo: Instituto Geográfico Nacional de Costa Rica/Wikimedia Commons)

“Exploring the unknown in space and the atmosphere is at the heart of who we are at NASA,” said Thomas Zurbuchen, associate administrator of the Science Mission Directorate at NASA Headquarters. “Understanding the data we have surrounding unidentified aerial phenomena is critical to helping us draw scientific conclusions about what is happening in our skies. Data is the language of scientists and makes the unexplainable, explainable.”

Interest in official UAP research has been growing over the last several months. In May, the Pentagon briefed Congress on a number of UAP sightings, only one of which has since been explained. The hearing, along with the 2021 preliminary UAP assessment that inspired it, earmarked this decade as the first in which the US government has (publicly) shown a serious interest in investigating what could be otherworldly phenomena.

According to NASA chief Bill Nelson, humans very well could have witnessed extraterrestrial activity in recent years. During a live-streamed interview with the University of Virginia last year, Nelson said hundreds of acknowledged UAP sightings still lack explanations. “I’ve talked to those pilots and they know they saw something, and their radars locked on to it. And they don’t know what it is. And we don’t know what it is,” he said. “We hope it’s not an adversary here on Earth that has that kind of technology. But it’s something… Who am I to say planet Earth is the only location of a life form that is civilized and organized like ours?”




Microsoft to Add Android 13 and New Features to Windows 11

Windows 11 brought a raft of new features like a revamped start menu, more control over snap layouts, and integration with more Microsoft apps and services. One of the biggest additions was the Windows Subsystem for Android, which allows your PC to run Android apps. The subsystem rolled out as a preview, but it’s already left that caveat behind in the big Windows 11 2022 update. Microsoft isn’t resting on its laurels, either. It already plans to improve the subsystem with a move to the latest Android 13 OS.

The Windows Subsystem for Android (WSA) is a virtual machine built into Windows, and it’s currently based on Android 12L (an upgrade over the original Android 12 distribution). Android is open source, so we have more insight into what Microsoft is doing with this feature than we do with other aspects of Windows 11. Over on the WSA GitHub page, Microsoft has posted a roadmap detailing the features it has added so far, along with the ones it plans to implement in the future. And right at the top of the list is Android 13.

Android 13 launched just a few weeks ago on Pixel phones, and Samsung only started rolling out the update to some versions of the Galaxy S22 in late October. It’s not the biggest update — most of the user-facing changes come in the form of enhanced theme support, which won’t matter on Windows, but the system optimizations and API updates are essential to future functionality. Microsoft’s apparent drive to keep the WSA updated is refreshing. Many Android-on-Windows projects have fallen behind, making it frustrating to run Android apps on a PC.

In addition to updating the WSA to Android 13, Microsoft plans to implement file transfer support, making it mercifully easier to move files between the Windows and Android parts of the system. Microsoft will also add default access to the local network, another way to make the Windows Subsystem for Android less compartmentalized.

Another planned addition is shortcuts, an Android feature that lets apps link to specific functionality. For example, a messaging app can provide a shortcut to a frequent contact that saves you from navigating to the conversation manually every time. There’s one more convenience feature on the way: picture-in-picture mode. That means Android media apps on Windows will be able to draw over the top of the Windows UI.

The roadmap doesn’t include any proposed dates or even a general timeline for these improvements. Microsoft was only a few months behind implementing Android 12L, and it recently opted to change the way it releases updates for Windows. No more will we have to wait for major semi-annual releases to get new features. So, we can expect the WSA enhancements to arrive whenever they’re ready.




Apple Will Begrudgingly Switch iPhone to USB-C, Exec Confirms

(Photo: Mishaal Zahed/Unsplash)
At long last, Apple is bringing USB-C to the iPhone—it’s just not exactly thrilled about it. During a media interview Tuesday, an Apple marketing executive confirmed the company would be complying with the European Union’s new law requiring all mobile devices to charge via USB-C.

The confirmation came during The Wall Street Journal’s Tech Live conference this week. Reporter Joanna Stern sat onstage with senior VP of global marketing Greg Joswiak and software VP Craig Federighi to discuss Apple’s overarching product innovation philosophy. Stern asked the duo how Apple planned to approach the EU’s USB-C requirement, which was solidified this month after a year of legislative work.

“Obviously we’ll have to comply, we have no choice,” Joswiak said. “We think the approach would’ve been better environmentally, and better for our customers, to not have the government be that prescriptive.”

From an environmental standpoint, Joswiak worries the EU’s new requirement (which will take effect in 2024) will generate the very e-waste it aims to prevent. iPhone users who already own Lightning cables will have no use for those cables once they upgrade their phones. Considering Joswiak’s point that over a billion people own iPhones—and that many iPhone users have more than one charger—it’s possible that 2024 will see a drastic uptick in cable disposal.

Software VP Craig Federighi (left) and senior VP of global marketing Greg Joswiak (right) onstage at WSJ’s Tech Live conference.

But will all of us be forced to ditch our Lightning cables, or just those upgrading within the EU? Joswiak refused to say whether Apple would be switching to USB-C on all of its future mobile devices or only those it would sell across the pond. There are some customers outside the EU who’d prefer that all iPhones charge via USB-C; after all, iPads now use USB-C, so why not go all in? On the other hand, some likely share Joswiak’s sentiment that a port change would be inconvenient and wasteful.

“I don’t mind governments telling us what they want to accomplish,” Joswiak said during the interview. He pointed to past instances in which mobile phone manufacturers were forced to adopt specific hearing aid compatibility standards that ended up failing more often than not. Joswiak said he’d rather government entities let tech companies find their own ways of meeting collective goals than prescribe restrictive methods.

Joswiak’s concerns might stem from a desire for design autonomy, but they might also be the result of Apple’s desire to enforce brand loyalty. Lightning cables are only compatible with Apple products—they can’t be used with any other manufacturer’s devices. Once a person has invested in Apple infrastructure, they might find it too inconvenient to switch to, say, a Samsung or Google smartphone. (The same concept can be seen in Apple’s commitment to keeping Android users out of iMessage.) Once iPhones switch to USB-C, users within the EU will be one step closer to potentially abandoning their brand loyalty and dumping Apple for good…and from a monetary standpoint, why would Apple want that?




Windows 11 Is Causing Issues for AMD Ryzen 9 CPUs, Nvidia GPUs

The centered Taskbar can be moved back to its usual position if you don't like the option.

Gamer adoption of Windows 11 has been an ongoing topic of interest lately. Many are still happy with Windows 10, but people are slowly starting to come around to Microsoft’s newest OS. However, all is not rosy for those who have high-end components. New reports have surfaced that Windows 11 is causing issues with AMD’s newest dual-chiplet CPUs, and there’s a bug in the latest 22H2 update that’s affecting Nvidia GPU owners. No, not that bug—this is a new one. The Nvidia bug is simply a visual anomaly, but the AMD flaw can affect performance.

Let’s start with the good news. If you have an Nvidia GPU and Windows 11 22H2, open Task Manager and check your GPU usage while the system is idle. You might be shocked to find it sitting at close to 100 percent for no discernible reason. This is a bug, according to Neowin, and it’s a different bug from the previous one that was causing BSODs and sluggish performance.

That one has been patched, and thankfully this new one doesn’t seem to affect performance at all, though it’s reportedly affecting a wider range of Nvidia GPUs. It inverts the reading for “3D” activity, so when the card is doing nothing, Task Manager shows almost 100 percent utilization. A screenshot of it was posted to Reddit by /u/washed_king_jos. Note the card in the screenshot is running at 39C, so this seems like a clear bug in the utilization reading.

The next bug is more serious but will affect fewer people. The Windows thread scheduler “intelligently” assigns tasks to CPU cores, and it’s apparently having trouble with the dual chiplets in the new high-end Ryzen CPUs, including the Ryzen 9 7900X and 7950X. This was confirmed by a Twitter user named CapFrameX, via TechRadar. They were able to boost gaming performance by turning off the second CCD (core compute die) and/or disabling simultaneous multithreading. Doing so improved gaming performance by up to 30 percent in some scenarios. In Metro: Exodus, for example, the 7950X hit 151fps in stock trim and 176fps with only one CCD.
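Until a fix lands, the workaround amounts to keeping a game’s threads on one chiplet. Below is a minimal sketch of that idea using psutil’s cross-platform affinity API; the process name and the assumption that logical CPUs 0–15 map to CCD0 (a typical 7950X layout with SMT on) are ours, so check your own topology. Disabling a CCD in the BIOS, as CapFrameX did, achieves the same thing more bluntly.

```python
# Pin a running game to the first CCD's logical CPUs so the scheduler
# can't bounce its threads across chiplets. Illustrative only.
import psutil

GAME_EXE = "game.exe"        # hypothetical process name
CCD0_CPUS = list(range(16))  # assumed CCD0 logical CPUs; verify on your system

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(CCD0_CPUS)  # restrict scheduling to CCD0
        print(f"Pinned PID {proc.pid} to CCD0")
```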

The Ryzen 9 7950X’s dual chiplets. (Image: AMD)

This bug reportedly does not affect the Ryzen 5 7600X and Ryzen 7 7700X. Those CPUs only have one CCD, adding evidence to the theory that Windows’ thread scheduler is confused by dual chiplets. This is obviously a big problem for folks who just dropped a wad of cash on a new, high-end CPU; they shouldn’t have to disable half their CPU cores just to get the best gaming performance out of their swanky Zen 4 chips.

It has not been an easy upgrade path for AMD CPU owners with Windows 11, to say the least. Last year, when the OS launched, it delivered a sizable L3 cache latency penalty for Ryzen CPU owners. That bug was eventually patched by both companies in an OS and chipset update. “Incorrect thread scheduling” was also a suspected issue for Ryzen users back in the Windows 10 days, when some games ran faster on Windows 7 than on 10; in the end, AMD announced it was not a problem. Still, there’s some history here. Hopefully, AMD and Microsoft will get this new issue patched ASAP before buyers look down the stack—or across the fence at Raptor Lake.




NASA Forced to Fire ISS Thrusters to Dodge Russian Space Debris

The International Space Station

NASA has announced that the International Space Station (ISS) executed an orbital course correction on Monday. The ISS rarely needs to adjust its orbit, but it was necessary in this case to avoid a piece of Russian space debris. The agency says the dangerous bit of junk came from the Cosmos 1408 satellite, which Russia destroyed in an anti-satellite-weapons test in late 2021.

The crew fired the engines on Progress 81 (fittingly, a Russian cargo vessel docked at the station) for about five minutes (305 seconds). Following the maneuver, the station’s apogee (the highest point in its orbit) was raised by 0.2 miles, and its perigee (the lowest point) by 0.8 miles. That was enough to steer well clear of the debris, which was projected to pass within three miles of the station. Even a small piece of space junk could seriously damage the station and risk the lives of astronauts due to its high speed relative to the ISS.
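For the curious, those apogee and perigee changes imply a very small burn. Here’s a back-of-envelope estimate from the vis-viva equation, treating the ISS orbit as circular at roughly 420km altitude (our assumption; NASA didn’t publish the maneuver’s delta-v):

```python
# Rough delta-v estimate for the reboost, assuming a near-circular orbit.
import math

MU = 398600.0          # Earth's gravitational parameter, km^3/s^2
r = 6371.0 + 420.0     # assumed ISS orbital radius, km
v = math.sqrt(MU / r)  # circular orbital speed, ~7.66 km/s

# A tangential burn dv changes the semi-major axis by da = 2*a^2*v*dv/MU
# (differentiate vis-viva); da is the mean of the apogee and perigee gains.
da = (0.2 + 0.8) * 1.609 / 2          # miles -> km, averaged
dv = da * MU / (2 * r**2 * v)         # km/s
print(f"estimated delta-v: {dv * 1000:.2f} m/s")  # roughly 0.45 m/s
```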

NASA says the Pre-Determined Debris Avoidance Maneuver (PDAM) on Monday evening did not impact space station operations. However, this may become a more common operation as the orbit around Earth becomes increasingly crowded. The testing of anti-satellite weapons certainly isn’t improving matters either. Russia’s decision in November 2021 to test its anti-satellite weaponry on its own satellite produced over 1,500 pieces of trackable debris, all of them potentially dangerous to space operations.

NASA played it straight when announcing the course correction, leaving its probable frustration with the Russians unsaid. The agency got about as spicy as it could after the original test, when Russia was roundly criticized by the world’s space agencies, including NASA, which called the test “dangerous and irresponsible.” More recently, Russia has threatened to use similar weapons against SpaceX’s Starlink satellites, which are providing connectivity to Ukrainian military forces in their war with Russia.

As illustrated by this maneuver, an escalation of space warfare that involves picking off satellites could put the ISS and other missions at grave risk. That’s one of the reasons the US proposed an end to orbital weapons tests several weeks ago. It’s unlikely Russia would agree to such a ban — it’s already leaning away from working with other space agencies. Earlier this year, Russia’s Roscosmos announced that it would pull out of the International Space Station after 2024 so it could focus on building its own station.




Wednesday 26 October 2022

San Jose Earthquake Proves ShakeAlert Works as Designed

(Photo: Library of Congress/Unsplash)
Though we haven’t yet figured out how to actually predict earthquakes, a system based in California aims to give people in earthquake-prone regions a few extra seconds to take cover. ShakeAlert is an earthquake early warning (EEW) system that initiates urgent mobile phone alerts at the start of a quake—and yesterday’s magnitude-5.1 event near San Jose proved it works as intended.

ShakeAlert first came on the scene in 2019. The US Geological Survey (USGS), California Geological Survey, California Governor’s Office of Emergency Services, and several California universities had been working together for over a decade to create an EEW system that could alert people on the west coast to impending earthquakes. When ShakeAlert first launched, it only covered California; two years later, it expanded to include Oregon and then Washington.

The system works by using geographically distributed seismic sensors to detect two types of seismic waves: fast-moving compressional waves (P waves) and slower-moving shear waves (S waves). The sensors send these signals to ShakeAlert’s data processing center. Once the processing center receives signals from four separate sensors, it prompts the USGS to initiate an alert. If the sensors receive stronger signals as the earthquake continues, the USGS updates the quake’s magnitude accordingly. Though ShakeAlert is best known for its mobile notifications, its alerts can also be distributed via radio, television, public sirens, and the Federal Emergency Management Agency (FEMA) wireless emergency alert system (most recognizable for its dissemination of Amber Alerts).
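The physics of the head start is simple: the alert is triggered by the fast P waves arriving at sensors near the epicenter and then travels at network speed, while the damaging S waves crawl along at a few kilometers per second. Here’s a rough model in Python, using a typical crustal S-wave speed and a guessed end-to-end alert latency (both our assumptions, not ShakeAlert’s published figures):

```python
# Illustrative early-warning timing. The wave speed is a typical crustal
# value; the 5 s detection/processing/delivery latency is assumed.
S_SPEED_KM_S = 3.5
ALERT_LATENCY_S = 5.0

def warning_time_s(distance_km: float) -> float:
    """Seconds between the alert arriving and the S waves arriving."""
    return distance_km / S_SPEED_KM_S - ALERT_LATENCY_S

for d in (20, 52 * 1.609, 150):  # 52 miles ~= the reporter's distance below
    print(f"{d:6.1f} km from the epicenter: ~{warning_time_s(d):4.1f} s of warning")
```

The farther you are from the epicenter, the more warning you get, which is why the Daly City reporter quoted below had a few seconds to spare while people closer to the epicenter felt the shaking almost immediately.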

Around the same time it started serving Oregon and Washington, ShakeAlert sought to enhance its offering by examining how people respond to earthquakes on their smartphones, partnering with Google to do so. During the partnership’s initial stage, ShakeAlert and Google improved the delivery of earthquake notifications on Android phones. Then they moved on to sourcing data from mid- or post-quake Google searches. The idea is that when several people search for things like “earthquake Los Angeles,” Google can use those search locations to help determine the earthquake’s spread.

It isn’t clear exactly how much progress ShakeAlert and Google have made on that front, but when a magnitude-5.1 earthquake struck Santa Clara County, California on Tuesday, many smartphone users received notifications on their phones before they began to feel the ground shake. “Got earthquake early warning in Daly City just before I felt the shaking. Earthquake early warning says it was magnitude 5.1 in Santa Clara County,” tweeted LA Times reporter Ron Lin shortly after the event ended. “Looks like I was 52 miles northwest of the epicenter. I thought the MyShake ShakeAlert warning was a false alarm lol, and then I felt the shaking!” Another Twitter user said they received a ShakeAlert notification seven seconds before they started feeling the quake itself.

Though a seven-second head start might not sound significant, it can make all the difference to those within an earthquake’s radius. The gap allows people to duck under desks, steer clear of large trees, and otherwise seek cover, hopefully preventing serious injuries. The notifications are also an important facet of infrastructure protection. A well-timed alert can help trains slow down to avoid derailment, close water and gas valves to prevent utility disasters, and even prompt fire stations to open their bay doors before shaking can jam them shut.

ShakeAlert is still under development, meaning not all smartphone users will receive earthquake notifications without taking additional action. Android users appear to be receiving warnings automatically, but ShakeAlert recommends that iPhone users download the MyShake app if they live on the west coast and would like to receive urgent notifications.




Analyst: Most Metaverse Projects Will Go Out of Business by 2025

(Photo: JESHOOTS/Unsplash)
Given the recent prevalence of metaverse chatter, you’d think the new online space has a long, healthy road ahead. According to one analyst, this might not be the case. Matthew Ball, chief analyst at tech market research firm Canalys, recently shared that he believes most metaverse projects will shutter over the next couple of years.

Ball shared his thought process at Canalys’ 2022 Channels Forum in Barcelona, according to a new report from The Register. Addressing whether the metaverse could be considered “the next digital frontier” or “an overhyped money pit,” Ball said Meta’s convoluted project—and all its somewhat pitiful snags—can be considered a barometer for the metaverse’s success as a whole. He went on to say he believes most metaverse projects will have closed by 2025.

Ball’s analysis is in direct contrast with a prediction recently cited by Interpol, in which market research firm Gartner said a quarter of Americans will spend at least one hour in the metaverse per day by 2026. Meta itself has an even loftier hunch: Zuckerberg said earlier this year that he expects one billion people to be in the metaverse by 2030.

But from outside the metaverse development space, it’s far easier to understand Ball’s gloomy prediction. Despite investing billions in the platform, Meta (by and large the biggest player in the metaverse space) has had a tough time getting its version of the metaverse off the ground. Most have heard by now that Meta has to force its own employees to use its flagship metaverse product, Horizon Worlds. Meta employees’ hesitance to use Horizon Worlds at work makes sense, given recent confirmation that working in the metaverse totally sucks.

(Image: Meta)

Some big-name tech founders have rather explicitly (in more ways than one) dismissed the metaverse, saying it’s a disappointing product that eats up resources that could be used to fix real, existing problems—not just gratuitously create new ones. That aligns with Ball’s point regarding accessibility: In the midst of a cost-of-living crisis, few people are interested in or able to invest hundreds (if not thousands) of dollars in virtual spaces like the metaverse. “People are struggling in the real world, let alone in the virtual world, to be able to invest in property and items and other NFTs,” Ball said.

An undertaking as large and convoluted as the metaverse takes a lot of faith to pursue. Meta, along with tech giants like Microsoft, Apple, and Google, appears capable of sustaining that faith for now, even if it’s just to prove their recent investments have been worth it. If Ball is on the right track, it’ll only be a few more years before we find out whether that’s actually the case.




AMD to Ditch 12-Pin GPU Power Cable as Photos Appear of Melted RTX 4090 Adapters

GPU “Cable Gate” is an official thing now. This week a Redditor posted images of a melted 12-pin adapter cable on their Nvidia RTX 4090; the power plug on the PCB was irreversibly damaged as well. Naturally, the Internet was outraged, as this failure had been predicted before the GPUs even launched. Back in September, PCI-SIG issued a warning about using 12-pin power cable adapters, saying they could cause problems. Now, that has apparently happened. In the wake of this event, it’s been confirmed that AMD will not be using the new cable on its upcoming RDNA3 GPUs. This has led to a lot of questions about Nvidia’s 12-pin cable design, and Nvidia says it’s investigating the incident.

Backing up a tiny bit, the 12-pin power connector is not a new thing. Back when Nvidia launched its Ampere architecture, it also introduced a new 12-pin power connector. This change allowed for more room on the PCB, as it went from dual 8-pin connectors on high-end cards to a single 12-pin, mounted vertically. Nvidia used that extra space to improve cooling performance. The GPUs included 8-to-12-pin adapters, and all was well. However, things have changed with the introduction of the RTX 4090.

The new ATX 3.0 and PCIe 5.0 specs provide a single cable (above) from the power supply to the GPU. This connector, dubbed 12VHPWR, differs from the Ampere cable in that it adds four signaling pins, for 16 pins total, that communicate with the GPU. However, PSUs with the new cable aren’t readily available yet, so you have to use an adapter. For the RTX 4090, it’s typically a four-into-one adapter using 8-pin cables.

Previous reports suggested these adapters could cause issues with new, high-power GPUs when using ATX 2.0 power supplies and adapters. PCI-SIG sent a letter to its members warning them about this scenario. According to Wccftech, it stated, “Please be advised that PCI-SIG has become aware that some implementations of the 12VHPWR connectors and assemblies have demonstrated thermal variance, which could result in safety issues under certain conditions.” Thermal variance seems to be the key phrase here.

Redditor /u/reggie_gakill posted this photo of his melted cable and connector in /r/Nvidia.

Now a Redditor has posted photos of a melted cable and connector on an RTX 4090, and in the same thread, someone posted a second image of a melted connector. Additionally, when YouTuber JayzTwoCents made a video about this controversy a while back, Nvidia told him there was nothing to worry about, saying it had done its testing and found no issues.

A rep for Nvidia has apparently reached out to the affected Reddit user, and the company’s PR chief says it is now investigating. In a statement to The Verge, Bryan Del Rizzo said, “We are investigating the reports,” adding that the company is collecting information from Reddit users.

Following this imbroglio, it’s being reported AMD will not be using that cable at all for its upcoming RDNA3 GPUs. Instead, it’ll use a traditional dual 8-pin configuration, at least on the high end. That’s what sources told Kyle Bennett, formerly of [H]ardOCP notoriety, and it was then confirmed by an AMD SVP. This applies to AMD reference cards only; it’s unclear if partner boards from MSI and others will also jettison the controversial connector. It’s also unknown if a cable capable of delivering 600W will even be necessary for AMD’s next-gen cards.

After all, if AMD’s high-end card only requires two 8-pin connectors, that’s a 300W budget. Add in 75W from the PCIe slot and you’re looking at a sub-400W GPU, compared with Nvidia’s 450W card. It’s possible that AMD’s solution is much more efficient than Nvidia’s as well. That’s simply because it will be the first consumer GPU with a chiplet architecture. This stands in contrast to Nvidia’s enormous monolithic die for Ada Lovelace.
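The connector budgets behind that arithmetic come straight from the PCIe specs: 150W per 8-pin plug and 75W from the slot, versus 600W for a full 12VHPWR feed.

```python
# Board power ceilings implied by connector choice (per PCIe specs).
def budget_8pin(num_8pin: int, slot_w: int = 75) -> int:
    return num_8pin * 150 + slot_w  # 150 W per 8-pin plug

print("Rumored AMD flagship (2x 8-pin):", budget_8pin(2), "W")  # 375 W
print("RTX 4090 (12VHPWR + slot):", 600 + 75, "W ceiling; 450 W rated")
```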

Still, it’s a stumble for Nvidia’s newest GPUs. The exact culprit behind the melting remains to be seen. When the GPUs launched, some reviewers called out the scary amount of bending required to tuck the cables away. When you bend the adapter down to hide the cables, it puts a lot of pressure on the connector attached to the PCB, which some predict could pry its connections loose over time. The issue is exacerbated by how large the GPUs are, as their size can place the connector close to the side panel. As an alternative, CableMod sells a 90-degree adapter to prevent flex at the connector. The other option is to buy an ATX 3.0 PSU, but those are still not as prevalent as the ubiquitous ATX 2.0 models.

(Image: CableMod)

For now, we will have to wait and see what Nvidia has to say when its investigation is completed. People spending $1,600 on a GPU don’t necessarily want to drop another $200+ on a new ATX 3.0 power supply, so the adapters are necessary. But if those adapters are causing issues, they might not have a choice.

Nvidia and its partners have surely shipped thousands of GPUs to customers, if not more, and so far there are only two published reports of melting. Still, it’s concerning that it has happened at all.
