Tuesday, 30 November 2021

Analyst: Apple’s AR Glasses Will be Just as Powerful as a Mac Computer

It’s been rumored for some time now that Apple has been working on some type of Augmented Reality (AR) device to pair with its iPhone, and now we have some spicy new rumors to add to the mix. According to CNBC, famed analyst and Apple product future-seer Ming-Chi Kuo of TFI Asset Management says Apple is about a year away from launching the glasses, and that they will be just as powerful as its low-end MacBook computers thanks to the inclusion of Apple’s vaunted silicon.

As the report notes, it was previously assumed that Apple’s AR glasses would be similar to current AR offerings in that they would need to be tethered to another device, presumably an iPhone, to handle the actual heavy lifting involved in rendering 3D objects in space. But with the arrival of Apple’s power-sipping M1 silicon, it appears the company is now angling to use newly designed chips similar to the M1 as the brains of the operation. If this rumor pans out, it would indeed allow Apple to distance itself from its rivals in the still-nascent product category, as Kuo states the glasses would be able to operate independently of other devices, such as a Mac computer or iPhone. Kuo has stated that Apple’s goal is to replace the iPhone with AR within 10 years.

One of the promises of AR is this Mercedes feature, with the fishbones, or turning indicators, overlaid exactly where you’re supposed to turn.

Kuo goes on to predict the glasses will use two chips, stating, “The higher-end processor will have similar computing power as the M1 for Mac, whereas the lower-end processor will be in charge of sensor-related computing.” He believes the optical portion of the product will employ dual 4K micro OLED panels from Sony, and that it will be powerful enough to operate as its own independent product separate from the iPhone, with its own ecosystem of software and apps, according to 9to5mac. Kuo has previously stated the Apple glasses will also use Wi-Fi 6, which allows for maximum bandwidth with low latency, a requirement for any enjoyable AR/VR experience.

However, according to reputable Apple reporter Mark Gurman, Apple is actually making a headset first, positioning it as an expensive, niche product that will be followed up in the future with a product that looks more like glasses. Gurman predicts the initial product will be quite premium, with a price tag hovering near $3,000.

Though there’s not much happening in the AR glasses market at the moment, since there are still limited use cases for them and they usually need to be tethered to a powerful computer to function, Apple’s arrival in the market could shake things up a bit, as Apple is wont to do. Apple is known for taking existing technologies and making them easier to use, so it’s possible that’s what it intends to achieve here. Either way, this is a product category that is seemingly heating up, especially with Facebook changing its name to Meta and beginning to talk about the hardware it’s working on to facilitate the eventual transformation of the human race into our avatars.

Xbox Series S Supposedly Black Friday’s Most Popular Console

(Photo: Mika Baumeister/Unsplash)
Every year after Thanksgiving a flurry of steals and deals takes America by storm, and discounts involving the newest gadgets—including pricey, hard-to-find video game consoles—enjoy a special level of anticipation. This Black Friday, the new Xbox Series S has beaten out every other console on the market.

According to the Adobe Digital Economy Index as reported by Business Insider, the Xbox Series S proved itself this year’s underdog victor, unexpectedly overcoming competitors from Sony, Nintendo, and even Microsoft’s own lineup. Based on “over one trillion visits to U.S. retail sites” that occurred during Black Friday 2021, Adobe calculated that the Xbox Series S was the best-seller.

Since its release about a year ago, the Xbox Series S has somewhat lived in the shadow of its fancier, more expensive counterpart, the Series X. While the Series X boasts nearly twice the storage, 4K compatibility, a disc drive, and a more powerful processor, the Series S forces gamers to own all their titles digitally and can’t manage quite the same level of graphical fidelity. The Series S is also smaller and less than half the weight, which can be a pro or a con, depending on one’s priorities. 

Though the Xbox Series X (pictured) is more powerful, the Series S is more affordable—and far easier to find. (Photo: Billy Freeman/Unsplash)

Though the Xbox Series X, PlayStation 5, and Nintendo Switch OLED have hoarded the limelight throughout most of 2021, a lower barrier to entry makes the Xbox Series S a far more accessible option. Hopeful gamers can easily find the Xbox Series S in stores and online, while the other consoles have been rendered tough to find by supply chain issues, like the ongoing global chip shortage. They don’t have to resort to buying from scalpers, who drastically inflate the price of available stock just because they can. There’s also something to be said for the more affordable pricing of the Xbox Series S, which comes in at $300. (The next-cheapest hot console is the $350 Nintendo Switch OLED; prices rise pretty quickly from there.)

The chip shortage impacting console availability—and that of virtually everything else—isn’t expected to end until at least mid-2022, meaning the Xbox Series S may continue to enjoy the spotlight a little longer. The holidays prove a unique challenge-turned-opportunity, too; though many people try to get their holiday shopping done during Black Friday, the season for giving isn’t over yet, and supply chain issues are anticipated to continue as the year comes to a close. 

AT&T and Verizon Agree to Limit 5G Power to Resolve FAA Standoff

A 5G millimeter wave cell site on a light pole in Minneapolis.

The 5G rollout in the US has been slow and uneven. While some countries had a wealth of mid-band spectrum to use for 5G, pickings were slim in the US. That’s why Verizon and AT&T leaned so heavily on millimeter wave (mmWave) 5G early on. Both carriers are anxious to fire up their newly acquired C-band frequencies. Unfortunately, a disagreement with the Federal Aviation Administration (FAA) delayed those plans. Now, they’ve got a plan that will allow C-band to move forward in early 2022. 

According to a letter sent to acting FCC chair Jessica Rosenworcel and seen by Cnet, the parties have arrived at a compromise. AT&T and Verizon have agreed to limit the transmission power of their C-band equipment nationwide. In addition, they will enforce even more stringent limits on power near regional airports. With these restrictions, the carriers will move forward with the January 5th launch date. 

As wireless spectrum becomes ever more crowded, we’re learning that not all 5G is created equal. mmWave 5G operates at very high frequencies in the tens of gigahertz. These signals have high bandwidth — Verizon says its network can run at up to 2 Gbps — but the drawback is extremely poor range. C-band is lower, between three and four gigahertz. These waves used to be reserved for satellite TV, but improved efficiency means the C-band could be auctioned for 5G. 

AT&T and Verizon both snagged licenses for the upper C-band at 3.7-4GHz. One of the Federal Communications Commission’s (FCC) duties is to prevent wireless interference, and it initially gave the carriers permission to start broadcasting on those frequencies on December 5th. However, this part of the C-band is adjacent to frequencies used for radio altimeters. These devices monitor altitude using radio waves between 4.2 and 4.4GHz and are essential to landing systems. The FAA was worried that signals from Verizon and AT&T networks might leak over and interfere with these vital airwaves. 
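For a sense of the numbers involved, here’s a quick sketch of the spectrum arithmetic (the 28GHz mmWave figure is an assumed representative value, since mmWave deployments vary; the C-band and altimeter figures are the ones cited above):

```python
# Sketch of the 5G spectrum arithmetic described above. The 28 GHz mmWave
# frequency is an assumed representative value; the C-band and radio
# altimeter figures come from the article.
C = 299_792_458  # speed of light, m/s

def wavelength_cm(freq_ghz):
    """Free-space wavelength in centimeters for a frequency in GHz."""
    return C / (freq_ghz * 1e9) * 100

mmwave = wavelength_cm(28.0)   # ~1.07 cm -- "millimeter wave" territory
c_band = wavelength_cm(3.85)   # ~7.8 cm -- mid-band, far better range
print(f"mmWave (28 GHz): {mmwave:.2f} cm, C-band (3.85 GHz): {c_band:.2f} cm")

# Guard band between the top of the carriers' licenses (4.0 GHz) and the
# bottom of the radio altimeter band (4.2 GHz):
guard_mhz = (4.2 - 4.0) * 1000
print(f"Guard band: {guard_mhz:.0f} MHz")
```

The far longer C-band wavelength is what gives mid-band its superior range, and that 200MHz guard band is the buffer the FAA worried wasn’t wide enough.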

The proposed limits on 5G aren’t necessarily in place forever. The carriers have pledged to keep things as they are for a period of six months, at which time regulators can evaluate any potential interference with radio altimeters. It’s unclear how much impact, if any, this change will have on coverage for the new mid-band networks. AT&T and Verizon need every bit of performance they can muster to compete with T-Mobile, which has a huge pile of mid-band spectrum from its Sprint acquisition.

Monday, 29 November 2021

Rumor Mill: Nvidia Prepping Flagship RTX 3090 Ti GPU

If there’s one thing the gaming world needs right now it’s another outrageously expensive and unobtainable graphics card, and Nvidia is heeding the call with reports of an alleged top-tier GPU waiting in the wings. Dubbed the RTX 3090 Ti, this full-blown Ampere card will offer the entirety of the GA102 die’s performance envelope, along with higher-clocked memory from Micron to help distance the card from its lowly RTX 3090 predecessor.

The rumor springs forth from Twitter user Uniko’s hardware, who tweets the new GPU will feature new 21Gb/s memory from Micron, which is a bit of a boost from the memory used in the current RTX 3090, which runs at approximately 19.5Gb/s. The memory chips also feature twice the capacity of the previous modules, so only half the number will be required to reach the allotted 24GB of GDDR6X in the new GPU. This reduction in memory chips should allow the card to run a bit cooler, despite those chips requiring more power than the previous model due to the higher clock speeds. The GPU will also keep the same 384-bit bus as the current card, allowing it to theoretically offer up to 1TB/s of memory bandwidth.
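That 1TB/s figure follows directly from the bus width and per-pin data rate. A quick sanity check (the 2GB-per-chip density is an assumption implied by the “twice the capacity” claim):

```python
# Sanity-check the rumored RTX 3090 Ti memory figures from the article.
bus_width_bits = 384
data_rate_gbps = 21     # per-pin data rate of the rumored Micron modules

# Aggregate bandwidth: bus width (bits) * per-pin rate (Gb/s) / 8 bits per byte
bandwidth_gbs = bus_width_bits * data_rate_gbps / 8
print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")   # 1008 GB/s, i.e. ~1 TB/s

# GDDR6X chips expose a 32-bit interface, so in a standard (non-clamshell)
# layout a 384-bit bus needs 12 chips. At an assumed 2 GB (16 Gb) per chip,
# 12 chips reach the stated 24 GB -- half the 24 chips the original RTX 3090
# needed with its 1 GB modules.
chips = bus_width_bits // 32
print(f"{chips} chips x 2 GB = {chips * 2} GB")
```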

Other salient specs include the full allotment of GA102’s horsepower, including 10,752 CUDA cores (up from 10,496), the aforementioned 24GB of super-fast memory, and a total board power of around 400W or so, which is 50W more than the current RTX 3090. That’s an increase in core count of about 2.4 percent, and when you throw in the faster memory, it seems reasonable to assume the new GPU will be about five percent faster overall than its predecessor. There is no reliable information about clock speeds at this time, however. WCCFtech is reporting that despite being more powerful than the card it replaces, the MSRP should remain the same at $1,499.
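For the curious, the arithmetic behind that rough five-percent estimate looks like this:

```python
# Relative uplift of the rumored RTX 3090 Ti over the RTX 3090,
# using the core counts and memory speeds quoted in the article.
cores_3090, cores_3090ti = 10_496, 10_752
core_gain = (cores_3090ti - cores_3090) / cores_3090 * 100
print(f"CUDA core increase: {core_gain:.1f}%")     # ~2.4%

mem_3090, mem_3090ti = 19.5, 21.0                  # Gb/s per pin
mem_gain = (mem_3090ti - mem_3090) / mem_3090 * 100
print(f"Memory speed increase: {mem_gain:.1f}%")   # ~7.7%
```

A ~2.4 percent core bump plus ~7.7 percent faster memory lands in the right neighborhood for a roughly five percent overall gain, since real-world performance rarely scales linearly with either number.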

If this card does indeed exist, its arrival follows a pattern Nvidia has established in the past: release a cut-down version of its biggest die first, then follow up with an unfettered “big” version of the chip at the end of the product’s lifecycle, though previously that card was branded Titan. Nvidia seems to have abandoned the Titan naming altogether for some reason, and is now swapping back and forth between Super and Ti for upgraded versions of its current GPUs. While it favored Super in the previous Turing era, it has switched to Ti for its Ampere upgrades.

Still, the news of this card’s imminent release raises the question, “Why?” We already have an RTX 3080 Ti, which is very close in specs to the RTX 3090, aside from having half the memory. Not to mention that neither of these GPUs can be purchased for anywhere near its MSRP, leaving them costing over $2,000 on third-party sites, assuming you can even find one for sale.

Regardless, this rumored GPU is supposed to break cover in January 2022, perhaps as the company’s big announcement for CES. Go talk to your loan officer now, and watch this space.

El Salvador Wants to Build a Volcano-Powered Crypto City

(Photo: Harrison Kugler/Unsplash)
The world’s first “crypto city” is coming to El Salvador, and its cryptocurrency focus isn’t even the only thing that makes it exciting. With a good dose of fanfare, El Salvador President Nayib Bukele announced this month that beginning in 2022, the country would be building a “bitcoin city” at the base of Conchagua, an oceanside volcano. 

The city will be funded by bitcoin-backed bonds, with construction beginning 60 days after funding is secured. During his announcement at the end of a weeklong bitcoin promotional event, Bukele pitched the city as an ideal place for both residents and tourists, with restaurants, shops, homes, public transit, an airport, and other standard resources included. There will also be a central plaza in the shape of the bitcoin logo. 

But crypto-mining uses so much energy that entire countries’ electrical consumption pales in comparison, thanks to the round-the-clock complex mathematical calculations that unlock new tokens. How will El Salvador’s new city support the vast amounts of power its reliance on bitcoin will consume? That’s where the volcano comes in. As cool and alluring as a city at the base of a volcano may be, Conchagua serves a practical purpose: powering the processing and verification of events on the blockchain with geothermal energy. This involves a system that will extract hot groundwater and convert it to steam, which will in turn drive turbines and generate electricity.

Volcano San Vicente in El Salvador. (Photo: Oswaldo Martinez/Unsplash)

El Salvador already supports a quarter of its electrical grid with geothermal energy, which it harvests from a handful of the country’s 20 active volcanoes. In October its government set up a prototype for the crypto city’s system at the base of Tecapa volcano, though the results haven’t been particularly impressive so far. Geothermal energy production, while far “cleaner” than production involving fossil fuels, also impacts natural habitats surrounding the sites of groundwater extraction.

This isn’t the first time El Salvador’s government has aggressively pursued a bitcoin “first,” having made bitcoin legal tender alongside the US dollar and requiring that all businesses accept the cryptocurrency earlier this fall. (Whether El Salvadorian citizens like bitcoin’s sudden prevalence depends on who you ask.) These choices have prompted the rest of the world to question whether popular cryptocurrencies have a place in everyday business transactions. Alex Hoeptner, CEO of cryptocurrency exchange Bitmex, predicts at least five countries will accept bitcoin as legal tender by the end of 2022—but it’s unlikely these countries will have crypto cities powered by volcanoes. 

Apple Files Lawsuit Against NSO Group for its Pegasus Spyware Attacks

Apple talks up iPhone security, but Zerodium says it's falling behind.
(Image: Getty Images)
If you’ve been reading any news related to cybersecurity in the past few years, you’ve certainly heard the name NSO Group before. The Israeli company has gained notoriety recently for its Pegasus software, which it licenses to governments and other nation-state clients to theoretically monitor terrorists, criminals, etc. However, recent investigations discovered Pegasus was installed on the smartphones of journalists, activists, and business leaders all over the globe. Due to this shocking discovery, Apple has announced a lawsuit against NSO Group and its parent company, seeking to prevent the group from using any of Apple’s services and hardware in the future, and thereby protect its users from malicious attacks on their personal devices.

For a brief primer, Pegasus is essentially spyware that can be silently deployed against a target and used to monitor everything on a person’s mobile device. According to the filing, the Pegasus software was first identified by researchers at Citizen Lab at the University of Toronto, where it was discovered Pegasus could initiate what is known as a “zero-click exploit,” meaning it could deploy without any input from the user. The attack, which Citizen Lab named FORCEDENTRY, worked in several stages. First, the company allegedly contacted Apple’s servers in the US to identify other Apple users, then worked to confirm the target was using an iPhone. Next it sent “abusive data” to the target via iMessage, which disabled logging and allowed it to upload a bigger file: the payload. That bigger file was stored on iCloud servers, then delivered to the targets’ phones. Once the Pegasus payload was in place, it began communicating with a command-and-control server, through which a person could send commands to the phones. This allowed third parties to control the phones remotely, vacuuming up call logs, web browser history, and contacts, and even letting them turn on the phone’s microphone and camera and send what they captured back to the nefarious server. A consortium of global journalists launched an investigation into this situation in July, dubbed the Pegasus Project, and found, “Military-grade spyware licensed by an Israeli firm to governments for tracking terrorists and criminals was used in attempted and successful hacks of 37 smartphones belonging to journalists, human rights activists, business executives and two women close to murdered Saudi journalist Jamal Khashoggi.”

Image from an NSO Group brochure posted on SIBAT (The International Defense Cooperation Directorate of the Israel Ministry of Defense). (Image: Citizen Lab)

This seems like pretty standard spyware stuff, but what’s so remarkable about it is the zero-click aspect, as typically a user has to initiate the deployment of malware/spyware by clicking on a link sent to them, or take some kind of action. Not this time. This type of activity is only possible because NSO Group and other companies like it employ researchers who work to discover unknown vulnerabilities in popular software such as iOS, Microsoft Windows, and others, and use these gaps in security to develop software that can penetrate target devices before the developer catches on that there’s a flaw. These security holes are typically known as zero-days, because the developer has had zero days to fix the flaw. Companies like Apple, Microsoft, and Google have massive cybersecurity teams of their own who work to find these flaws before rogue actors do, but given the complexity of the software involved, it’s a never-ending battle against companies like NSO Group. Notably, Apple patched the vulnerabilities that allowed Pegasus to run with its iOS 14.8 update in September, and in its press release the company notes, “Apple has not observed any evidence of successful remote attacks against devices running iOS 15 and later versions.”

This is not the first time NSO Group has been in the headlines. The US government blacklisted the company earlier this month, “after determining that its phone-hacking tools had been used by foreign governments to ‘maliciously target’ government officials, activists, journalists, academics and embassy workers around the world,” according to The Post. The company is also embroiled in a lawsuit with WhatsApp over claims its spyware was used to hack 1,400 users of its app. Earlier this month, the Ninth Circuit Court of Appeals rejected NSO Group’s claim that it should have “sovereign immunity” in the case.

If you’re interested in a deep dive on the NSO Group, the podcast Darknet Diaries recently posted an episode about it, including an interview with the Citizen Lab researchers who discovered Pegasus. You can also read Apple’s full complaint right here.

NASA Launches DART Asteroid Deflection Mission

Earth has been pelted by space rocks on a regular basis for the entirety of its existence, and there’s nothing stopping it from happening again. The next big Earth impactor is already out there, and eventually, it’ll make itself known. For the first time, there’s a chance we could stop such an object from clobbering the planet. NASA just launched the DART spacecraft early Wednesday morning (Nov 24) on a mission to test asteroid redirection technology. Next year, it will collide with a space rock called Dimorphos with the aim of changing its orbit. 

NASA partnered with SpaceX to launch DART (Double Asteroid Redirection Test) aboard a Falcon 9 rocket. Its departure from Earth went off without a hitch, too. The spacecraft’s target is a binary asteroid system consisting of Dimorphos (the target) and 65803 Didymos, the larger asteroid around which Dimorphos orbits. These objects cross Earth’s orbit, making them potentially hazardous in the future. 65803 Didymos is almost a kilometer in diameter, so it could cause major devastation if it were to hit Earth. Dimorphos (formerly known informally as Didymoon) is roughly 160 meters across, small enough that DART might be able to knock it off course. 

Dimorphos and Didymos make an ideal system to study the effectiveness of asteroid redirect technology. NASA will analyze the orbit of Dimorphos before and after the impact to see how it was affected. Dimorphos will become the smallest celestial body ever visited by a human spacecraft, making the mission that much more challenging. The 1,100-pound (500 kilograms) DART needs to line up its attack run and hit Dimorphos at 6.6 kilometers per second, all while the NASA team is 6.8 million miles away—it’s all automated. 
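A bit of arithmetic shows why the terminal approach has to be automated: at that distance, a radio command from Earth takes over half a minute to arrive.

```python
# Why DART's terminal guidance must be automated: one-way signal delay
# at the 6.8-million-mile distance quoted above.
C = 299_792_458            # speed of light, m/s
MILES_TO_M = 1_609.344

distance_m = 6.8e6 * MILES_TO_M
one_way_s = distance_m / C
print(f"One-way signal delay: {one_way_s:.1f} s")   # ~36.5 s

# In the time a single command takes to arrive, DART (at 6.6 km/s) covers:
print(f"Distance covered meanwhile: {one_way_s * 6.6:.0f} km")
```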

In addition to smashing itself to bits, DART will deploy a small secondary satellite from the Italian Space Agency called LICIACube. This spacecraft will witness DART’s demise from 34 miles (54 kilometers) away and measure the amount of debris kicked up from the impact. 

To judge the success of the mission, astronomers will look at how fast Dimorphos completes an orbit around Didymos. Currently, the smaller rock orbits the larger one every 11.9 hours. If the orbit speeds up by just 73 seconds, the mission will be deemed a success, though some members of the team are expecting a much larger change, on the order of ten minutes. If we can manage that with the relatively small DART impactor, it should be possible to nudge a killer asteroid into a safer orbit with similar technology. The real challenge might be spotting the target before it’s too close to redirect. We’ll know if humanity has a new tool to preserve life on Earth sometime next fall.
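Using only the figures above, the success threshold turns out to be a remarkably small change:

```python
# How big a change counts as success, per the figures in the article.
period_s = 11.9 * 3600    # current orbital period of Dimorphos, in seconds
threshold_s = 73          # minimum period change for mission success
hoped_s = 10 * 60         # the ~ten-minute change some team members expect

print(f"Threshold: {threshold_s / period_s:.2%} of the orbit")   # ~0.17%
print(f"Hoped-for: {hoped_s / period_s:.2%} of the orbit")       # ~1.40%

# DART itself brings only modest momentum to the collision, using the
# mass and impact speed quoted above:
momentum = 500 * 6_600    # kg * m/s
print(f"Impact momentum: {momentum:.1e} kg*m/s")
```

That sub-percent threshold is why NASA has to measure the orbit so carefully before and after impact.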

AMD Allegedly Jacking Up RX 6000 GPU Prices by 10 Percent

Here’s some more bad news for gamers: that outrageously priced GPU you can’t find in stock is about to get a little bit more expensive. Back in August, TSMC began warning its partners that wafer costs would be going up soon, and we now have the first semi-official confirmation that it is indeed happening. A post on the Board Forums alleges AMD has sent a notice to all of its Add-in Board Partners (AIBs) that it’s increasing the price of its RX 6000 series GPUs by 10 percent across the board, according to Videocardz. This pricing change will apparently take effect with the next shipment of GPUs to its partners, driving up the price of these cards by $20 to $40 USD. This news arrives just in time for the holiday shopping season, when demand for GPUs is expected to increase even more, as if that were even possible.

According to a translation of the board posting, AMD is citing TSMC wafer costs as the reason for the change; as we reported earlier, sub-16nm prices, including 12nm, 7nm, and 5nm, are said to have increased roughly 10 percent, while TSMC’s older nodes have gone up by as much as 20 percent. AMD seems to be passing this price increase along to its partners, who in turn are passing it along to us, the customers (or the scalpers, as it were, who then pass it along to us, the gamers). Although, as Videocardz points out, AMD also produces its CPUs at TSMC, and there hasn’t been a similar across-the-board increase on that side, which is curious.

Overall, this is just another blow for gamers hoping to find a GPU at a reasonable price any time soon, though it’s not exactly surprising. Not only did TSMC announce a few months ago that its prices were going up, but every bit of recent news regarding the global supply chain, GPU pricing, and the cost of anything that allows us to have any fun has been along the lines of, “the problem is not going to get better, it’s only getting worse.” We reported in September that GPU prices were rising again after a brief lull in August, adding that companies like Nvidia were indicating the current shortage (and associated price increases) might last at least until this time next year.

Trying to find a GPU for sale that is anywhere near MSRP is a fool’s errand these days.

To its credit, AMD does theoretically sell cards at MSRP direct from its website, but we’ve never actually seen one in stock there, although it’s not like we check daily. Nvidia used to sell its Founders Edition cards directly from its website as well, but that situation was such a debacle that it now just directs customers to channel partners like Newegg and Best Buy so they can see for themselves if the cards are in stock (they’re not).

Wednesday, 24 November 2021

Super-Powered Gameboy Advance Runs PS1 Games

If finding a way to play PlayStation 1 games on a Gameboy Advance was one of your 2021 resolutions, you’re in luck. A modder by the name of Rodrigo Alfonso has hacked a cartridge for the 20-year-old handheld, allowing them to play 3D games like Crash Bandicoot and Spyro: Year of the Dragon on a console that was never made to do so.

Alfonso hacked the Gameboy Advance by moving its game processing and rendering to a custom cartridge, which contained a Raspberry Pi 3 with a PlayStation emulator. The Raspberry Pi allows Alfonso to stream any RetroPie-compatible game to the Gameboy Advance through the device’s link port. 

Alfonso is limited to streaming very low-res graphics—240×160—but that’s kind of the point. Depending on how Alfonso feels about framerate when they run a game, they can elect to stream at 240×160 or 240×80 and enjoy the retro vibe intense scanlines offer, or they can go for 120×80 and utilize a slightly brighter mosaic effect. The display itself is illuminated using a backlight mod with an AGS101 display. Alfonso shared videos of their success with the mod on their YouTube channel and provided replication instructions on GitHub. 
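Those resolution options trade pixels for frame rate, since every frame has to squeeze through the link port. A rough sketch of the per-frame data sizes (the one-byte-per-pixel figure is an assumption for illustration; Alfonso’s actual encoding may differ):

```python
# Rough per-frame data sizes for the three streaming modes mentioned above.
# Assumes 1 byte per pixel (e.g. a 256-color palette); the actual encoding
# used by the mod may differ.
modes = {"full": (240, 160), "scanline": (240, 80), "mosaic": (120, 80)}

for name, (w, h) in modes.items():
    pixels = w * h
    print(f"{name:>8}: {w}x{h} = {pixels:,} px (~{pixels / 1024:.1f} KB/frame)")

# Halving the line count halves the data per frame, so the same link-port
# bandwidth can carry roughly twice the frame rate.
```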

Because RetroPie is capable of emulating a wide range of retro consoles from the Nintendo Super NES to the Atari 7800, Alfonso’s hacked Gameboy Advance is able to play far more than just PS1 games. (One of Alfonso’s videos depicts Battletoads and Super Mario RPG: Legend of the Seven Stars being played on the device; the former was developed for the NES and the latter for the Super NES.) But given that the PS1 wasn’t a handheld and was manufactured by an entirely different company—versus the other Nintendo consoles the modded Gameboy Advance can now emulate—Alfonso’s ability to theoretically play Metal Gear Solid on the 2.9-inch screen is extra impressive.

The Gameboy Advance is no stranger to modification. Tinkerers have long sold custom cartridges for the old-school device, allowing users to basically make their own game cartridges, even with multiple game files on one “flash cart.” Unfortunately, no matter how good their console hacking skills or comprehensive their emulators, fans of the original Gameboy Advance are unlikely to ever experience PlayStation 2 (or later) games on the tiny screen, given those later consoles’ incorporation of analog sticks into their controls. 

Rolls-Royce ‘Spirit of Innovation’ All-Electric Aircraft Smashes World Records in Latest Flight

Rolls-Royce has been working on an all-electric plane, called “Spirit of Innovation,” and the company reports that the aircraft absolutely clobbered at least three world records in its latest test flights. For an aircraft that’s only been in the air for a few hours total, that’s pretty impressive — this report comes not quite two months after the aircraft took off for its maiden flight.

After the flights, the company announced: “We have submitted data to the Fédération Aéronautique Internationale (FAI) – the World Air Sports Federation who control and certify world aeronautical and astronautical records – that at 15:45 (GMT) on 16 November 2021, the aircraft reached a top speed of 555.9 km/h (345.4 mph) over 3 kilometres, smashing the existing record by 213.04 km/h (132mph).”

They go on to say that the day’s later test flights also belong in the record books. While the later flights weren’t pushed quite as fast, at least one still hit 330mph. They also report breaking the fastest time to climb to 3km altitude by an entire minute, clocking in at a final 202 seconds, as well as breaking two other speed records over distances of three and fifteen kilometers, respectively. The flights took place at the UK Ministry of Defence’s Boscombe Down aerodrome: an airfield not unlike Edwards AFB, used for testing new and experimental aircraft.
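Working only from the figures reported above, the derived numbers are striking:

```python
# Derived figures from the Spirit of Innovation record claims in the article.
# Previous 3 km speed record, implied by the new mark and the stated margin:
new_kmh, margin_kmh = 555.9, 213.04
print(f"Old record: {new_kmh - margin_kmh:.2f} km/h")   # ~342.86 km/h

# Average climb rate for the record 3 km climb in 202 seconds:
climb_rate = 3_000 / 202        # meters per second
print(f"Average climb: {climb_rate:.1f} m/s "
      f"(~{climb_rate * 196.85:.0f} ft/min)")
```

In other words, the electric aircraft didn’t just edge out the old mark, it beat it by more than 60 percent.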

Spirit of Innovation uses liquid-cooled Li-ion batteries and a 400kW powertrain developed with partners Electroflight and YASA, also of the UK. The single-seat aircraft has an ultralight carbon-fiber hull, and while it boasts the ability to put forth 500+ hp, it can land with two of its three batteries disabled.

While this one plane won’t revolutionize the industry, it does provide much-needed data for the idea of urban and commuter aircraft. “The characteristics that ‘air-taxis’ require from batteries,” said Rolls-Royce, “are very similar to what was developed for the Spirit of Innovation.”

The company added, “The advanced battery and propulsion technology developed for this programme has exciting applications for the Advanced Air Mobility market.”

There’s even a practical angle to the development of all-electric aircraft. The UK’s Business Secretary, Kwasi Kwarteng, said: “The government is proud to back projects like this to leverage the private investment necessary to unlock cleaner, greener aircraft which will allow people to fly as they do now, but in a way that cuts emissions.” R-R CEO Warren East added, “Following the world’s focus on the need for action at COP26, this is another milestone that will help make ‘jet zero’ a reality and supports our ambitions to deliver the technology breakthroughs society needs to decarbonise transport across air, land and sea.”

Report: Windows on ARM Is Exclusive to Qualcomm, But Not For Much Longer

Qualcomm support could give Windows a boost going forward.

For years, Windows PCs have flirted with ARM processors, but none of the hardware was good enough to compete with x86 chips from Intel and AMD. Now, we might know why. According to a new report from XDA Developers, Microsoft has an exclusivity agreement with Qualcomm that prevents other vendors’ ARM designs from integrating with Windows. That’s the bad news. The good news is it will expire soon. 

Microsoft and Qualcomm teamed up in 2016 to launch Windows 10 with support for ARM. Unfortunately, the difficulty of making software run well on Windows for ARM has impeded growth. This wasn’t even Microsoft’s first attempt at supporting ARM. In the Windows 8 era, it launched Windows RT for 32-bit ARM chips. It was a spectacular disaster, owing to the lack of app support. Newer chips from Qualcomm didn’t really help, either. The available hardware, like the Snapdragon 7c and the Microsoft-exclusive SQ1, leaned on Qualcomm’s mobile chip designs — they just didn’t have enough power for Windows. 

According to XDA’s source, the Microsoft x Qualcomm collab guaranteed the latter a period of exclusivity for Windows on ARM. That explains why we’ve only seen a handful of Qualcomm-powered Windows machines despite years of effort. It could also shed light on why there is still no way to run Windows 11 on the new M1-based Apple machines. 

Apple’s M1 chip proves ARM is ready for real computers.

We don’t know how long that deal was supposed to last or when exactly it will run its course, but we are assured it’s not too far off. That matches what we’re seeing in the market. MediaTek is gearing up to launch its own high-end ARM chips for Windows. Meanwhile, Qualcomm recently talked about its new generation of CPU designs from the newly purchased Nuvia team. These cores are designed for PCs, so they should be much more powerful when used in notebooks. 

These companies have made it clear they feel the transition from Intel to ARM is inevitable. Apple has already shown that ARM is a viable architecture for PCs, and perhaps Qualcomm’s exclusivity has only served to slow down the transition on Microsoft’s side. It might be a good time to nurse along your ailing Windows laptop. There could be big changes afoot in the coming year, provided that mysterious exclusivity times out. Five years seems like plenty of time for Qualcomm to make a go of things.

Now Read:



Tuesday, 23 November 2021

NASA Delays Webb Telescope Launch Following an ‘Incident’

NASA has been working on the James Webb Space Telescope for 20 years, and there have been numerous delays. The marvel of astronomical technology is currently preparing for launch, but NASA says we’ll have to wait just a bit longer. Following a minor “incident,” NASA has pushed the launch of Webb back by four days. That will give the team time to check for damage one last time before launch. 

The Webb telescope will serve as the successor to Hubble, which has survived long past its intended design life. With the aging telescope on the verge of failure on an almost weekly basis, the need for Webb has never been greater. Of course, it was supposed to be in operation years ago, but building the most powerful space-based observatory in human history is no simple feat. 

Several weeks ago, Webb made its journey from the US to French Guiana, where NASA’s European partners will launch the spacecraft aboard an Ariane 5 rocket. However, NASA says that an “incident” occurred while technicians were mounting the telescope to the launch vehicle adapter, which mates the observatory to the upper rocket stage. According to NASA’s initial report, a clamp band used to secure the telescope to the adapter was accidentally released. This “caused a vibration throughout the observatory.”

Webb arrived at the launch site by barge several weeks ago.

Webb is going to have to cope with intense vibration during launch, but there’s no reason to take any chances here. Webb’s total price tag is hovering around $10 billion, but that’s nothing compared to the time it took to design and build, making it the very definition of “irreplaceable.” NASA has convened an anomaly review board that will investigate the incident and conduct additional testing to ensure the observatory is still in perfect working order. Once it’s deployed, Webb will be too far away for any maintenance missions. 

Hopefully, we’ll hear in the coming days that the telescope is fine, and the four-day pause will be the last delay before Webb finally leaves Earth behind. When it’s finally operational, Webb will be able to peer at more distant, dimmer objects than any other instrument in the world from its vantage beyond the orbit of the moon. It could help us understand the dawn of the universe, the life and death of stars, and even help study exoplanets that could harbor life. We just need to get it into space in one piece.

Now Read:



Newly Announced Exoplanet-Hunting Space Telescope Funded by Breakthrough Initiative

Image by Wikipedia. Alpha Centauri AB is on the left, Beta Centauri on the right, and Proxima Centauri is at the center of the red circle.

Move over, James Webb: humanity is about to get another eye in the sky. There’s just been a new space telescope announced, named TOLIMAN, and it’s already got funding from the Breakthrough project.

The telescope is designed around two things: its target, and the exotic optics it will use. TOLIMAN’s mission is to point directly at the Alpha Centauri system in order to search for potentially habitable exoplanets there. The system actually contains three stars; Proxima Centauri (inside the red circle above) is confirmed to host a rocky planet in its Goldilocks zone, and there are likely several other planets elsewhere in the system.

Proposed design of the TOLIMAN space telescope. Credit: University of Sydney.

The name TOLIMAN stands for “Telescope for Orbit Locus Interferometric Monitoring of our Astronomical Neighbourhood.” Clunky, we know. But the acronym was chosen in homage to the space telescope’s target star system. Toliman is the official name of a star within the Alpha Centauri system: α Centauri B, the smaller and cooler of the binary pair around which Proxima Centauri orbits.

The 30-cm telescope is surprisingly small for a space telescope, but its target is Earth’s nearest neighboring star system: Alpha Centauri. Alpha Cen is the brightest star in the constellation Centaurus, visible in the southern sky, and it’s about 4.3 light-years from Earth.

Image: Stellarium, via NASA

The system was first documented by Arabic astronomers during the Golden Age of Islam. The word “Toliman” itself is the Latinized version of an ancient Arabic name for Alpha Centauri, which meant “the Ostriches.” But two other stars in the Southern sky already bore that name, so to bring the name of the new star into accord with the constellation in which it was found, it was later renamed Rijl al-Qinṭūrus. This in turn was Latinized to Rigil Kentaurus, “the Centaur’s foot,” which is where TOLIMAN is going to point.

TOLIMAN’s exotic optics are its other keystone feature. The telescope will use a “diffraction pupil lens” for its observations. Multiple overlaid structural patterns are arranged on the surface of the lens, so that the different areas separate incident light by its phase.

Top: Black and white regions show three overlaid patterns on the diffraction pupil lens; black and white regions in each are in antiphase. They perform a kind of binning by phase. Bottom: The point-spread field associated with each. Right: “The patterns to the left illustrate 3 separate log-harmonic spirals, while to the right is the combined effect of the sum of all 3 log-harmonic spirals (upper) together with the corresponding PSF (lower).” Credit: study authors.

Because of this ability to distinguish separate sources, this design lends itself very well to studying Alpha Cen in particular. The system’s binary pair are only about 23 AU apart — about the distance from the Sun to Uranus. That means they have very little angular separation between them. Even so, the two stars can be clearly distinguished by the diffraction pupil design, because where they might overlap visually, using this lens means that different light sources stand out from one another in an obvious, kaleidoscopic way:

Different light sources really do stand out quite sharply from one another. Image: Fig. 3, TOLIMAN project abstract; Tuthill et al., 2018.
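For a sense of scale, the on-sky separation implied by the figures above (roughly 23 AU of physical separation at about 4.3 light-years) can be estimated with the small-angle approximation. A quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope scale check using the article's figures:
# the Alpha Cen AB pair is ~23 AU apart at ~4.3 light-years away.
AU_PER_LIGHT_YEAR = 63_241       # 1 light-year is roughly 63,241 AU
ARCSEC_PER_RADIAN = 206_265      # small-angle conversion factor

def angular_separation_arcsec(separation_au: float, distance_ly: float) -> float:
    """Small-angle estimate of the pair's on-sky separation in arcseconds."""
    distance_au = distance_ly * AU_PER_LIGHT_YEAR
    return (separation_au / distance_au) * ARCSEC_PER_RADIAN

print(round(angular_separation_arcsec(23, 4.3), 1))  # -> 17.4 arcseconds
```

That works out to well under an arcminute at maximum separation, which is why separating the two stars' light cleanly, rather than merely resolving them, is the hard part the diffraction pupil is designed to solve.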

The international collaboration is led by Peter Tuthill of the Sydney Institute for Astronomy, and it includes teams from the University of Sydney, Breakthrough Initiatives, Saber Astronautics, and NASA’s JPL. Jason Held, CEO of Saber Astronautics, described TOLIMAN in a press release as “an exciting, bleeding-edge space telescope,” one that will be “supplied by an exceptional international collaboration. It will be a joy to fly this bird.”

Now Read:



WSJ: Samsung’s New $17B Chip Plant Will be in Taylor, TX

(Photo: Alerkiv/Unsplash)
Samsung has reportedly settled on the location for its upcoming chip plant as it seeks to “chip” in to the widespread semiconductor shortage. The company has chosen Taylor, Texas as the site for the $17 billion manufacturing hub, according to sources for the Wall Street Journal. Texas Governor Greg Abbott is expected to make an announcement regarding the plant later today. 

Samsung’s new plant will use up about 1,200 acres of land and will bring 1,800 jobs to Taylor once production kicks off in 2024. The facility is part of Samsung’s $205 billion investment in chip manufacturing and biotech, which will be the company’s focus over the next three years.

Samsung has been preparing to build a new US-based chip plant for a while now, but details about its plans have been few and far between. There was even a period of time in which Samsung (alongside TSMC and Intel) threatened to pull the plug on its plans should the locations under consideration not offer generous incentives to build (i.e. tax breaks). Clearly, Samsung’s concerns were addressed. Taylor reportedly beat out Austin, Texas with better tax incentives, though Samsung’s original US-based plant in Austin is expected to remain in operation following the new plant’s opening.

Samsung’s Austin manufacturing facility. (Photo: Samsung)

The conglomerate’s plan to open a new chip plant dovetails with a global chip shortage ExtremeTech readers are by now intimately familiar with. Samsung Vice Chairman Lee Jae-yong reportedly visited the United States to speak with White House officials about the shortage and discuss federal incentives for chipmakers. According to sources for The Korea Times, Samsung planned to reveal the location for the plant once Lee returned home to South Korea this week, but now the cat is out of the bag.

The Biden administration has been pushing for increased US chip manufacturing, recommending earlier this year that a cool $50 billion be put into research, development, and actual production of the highly sought-after resource. Semiconductors are in just about everything these days, and production delays related to Covid-19 have drastically impacted output, causing the tiny chips to be in short supply. Given that the shortage isn’t expected to end anytime soon, manufacturers and government officials alike see semiconductor fabrication as an opportunity to bolster the US economy and compete with countries such as China and Taiwan, which currently lead in production.

Now Read:



Wear OS Is Growing Again, and Google Has Samsung to Thank

Google was the first big tech firm to invest in the modern smartwatch when it worked with LG, Motorola, and Samsung to release Android Wear devices in 2014. It would be another year until the Apple Watch debuted, and it was all downhill for Android Wear after that. The branding change to Wear OS didn’t help, but partnering with Samsung did. This fall, Samsung launched the Galaxy Watch4 line, its first smartwatches running Android since 2014. Last summer, Wear OS was languishing at four percent of smartwatch shipments, and now it’s at 17 percent, according to Counterpoint Research.

Samsung’s history with Android-powered watches actually extends further back than Wear OS. Before Google decided how an Android-powered watch should work, Samsung made its own version of Android to run on the Galaxy Gear. It later updated that watch to the Tizen OS, which almost all of Samsung’s smartwatches have used instead of Android. There was, of course, the Gear Live, one of Google’s Android Wear launch devices. However, Samsung was all-in with Tizen after that. 

By coming back to Wear OS, Samsung gained access to a much larger collection of software. Even with years of mismanagement from Google, the Play Store has much more software for smartwatches than Samsung could ever collect in the Tizen store. Google, meanwhile, gets Wear OS on hardware that Samsung will market harder than any other OEM. And it’s working. 

In the third quarter of 2021, Wear OS shipments reached 17 percent of all smartwatches. That’s a strong number two finish behind Apple’s watchOS at 22 percent. It’s also a huge boost over the four percent share of the previous two quarters. Samsung has reason to celebrate, too. The move to Wear OS has helped to increase Samsung’s share of the market. It has overtaken Huawei this year, rising to the number two brand after Apple. 

Apple’s share of the market has dropped 10 percent over the past year, but the reason for that is clear. Apple was unable to launch its Series 7 watch alongside the new iPhones. The delayed device launched in the fourth quarter, so those numbers are not included in the report. It’s likely Apple’s new watch will give it a boost in the fourth quarter, and that could make Samsung’s year-end totals less impressive. 

Still, this is a turn-around for Wear OS, which is something Google desperately needed. Now it has to capitalize on it. Hopefully, Google doesn’t squander its second chance to be prominent in wearables like it did the first time.

Now Read:



Owners Resort to Hacking Smart Treadmills After NordicTrack Locks Them Out

It’s natural to expect that if you buy something, you can do whatever you want with it. However, the complexity of laws around intellectual property has made that difficult. The right to repair movement is gaining steam, with even Apple loosening restrictions on tinkering with your own hardware. NordicTrack is not so enlightened, though. After customers started installing their own apps on the company’s $4,000 X32i smart treadmill, it released a software update that locked them out. Owners aren’t happy. 

Exercise equipment is smarter than ever before. Companies like Peloton have made boatloads of cash by integrating subscription training services with the hardware, and that’s what NordicTrack does. The X32i is a spendy treadmill with a huge 32-inch touchscreen display, which delivers fitness content from NordicTrack’s iFit, a service that costs $39 per month. You’re probably familiar with these services, if only by reputation. There are suspiciously perky trainers urging you on, online leaderboards, 1-on-1 help, and more. 

Until now, anyone who wanted more from their $4,000 treadmill could simply unlock the device’s underlying Android OS. According to owners, the process was simple and spelled out in NordicTrack’s own support documentation: just tap the screen ten times, wait seven seconds, and tap ten more times. With access to the Android UI, you can sideload the apps of your choice and even use the browser to access a world of content online. 

NordicTrack uses the screens on its smart exercise equipment to deliver a variety of workouts for $39 per month.

In October, NordicTrack started rolling out an update that removed the so-called “privilege mode” from all its connected workout machines. According to NordicTrack, this is just about safety. Since the software can control the mechanical components of the treadmill, it doesn’t want people to install third-party apps in a public setting (the X32i is available to both consumers and commercial buyers). Owners who relied on sideloading have suddenly found their expensive treadmills are much less useful, and they’re scrambling to find workarounds. So far, the best they’ve found involves factory resetting the treadmill. It restarts with old software that includes privilege mode. Then, you have to block NordicTrack’s update servers at the network level to keep the new software from asserting itself. 
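For the curious, the blocking half of that workaround boils down to null-routing whatever hostnames the treadmill phones home to, via a router blocklist, Pi-hole, or a hosts file on a machine acting as its DNS. A minimal sketch; the hostname below is a placeholder, since the article doesn’t name NordicTrack’s actual update servers:

```python
# Sketch of the DNS-blocking half of the workaround: null-route the update
# hostname so lookups for it resolve to an unreachable address. The hostname
# here is a placeholder -- NordicTrack's real update servers aren't named
# in this article.
def hosts_block_line(hostname: str) -> str:
    """Return a hosts-file entry that sends the hostname nowhere."""
    return f"0.0.0.0 {hostname}"

print(hosts_block_line("updates.example-nordictrack.com"))
# -> 0.0.0.0 updates.example-nordictrack.com
```

The same entry works in any hosts-file-style blocklist; the point is simply that the treadmill can never reach the server that would push the privilege-mode-removing update.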

NordicTrack says anyone who has used a workaround to access privilege mode could find their warranty voided. That hasn’t stopped owners from trying to regain some of that lost functionality. NordicTrack can swear up and down this is a safety issue, but there are smarter ways to protect machines in public settings, such as an administrator account for personal use, which is simple to implement in Android. If anything, this sounds like NordicTrack is doing whatever it can to keep people paying $39 every month for content on that big 32-inch screen. You know what doesn’t have any content restrictions? A TV. You can just put one of those in front of a cheaper treadmill, just as our ancestors did.

Now read:



Monday, 22 November 2021

Winamp Prepares to Relaunch: Can it Still Whip the Llama in 2021?

It’s not a new development in the tech world that bringing back “formerly loved” items from the past is cool once again, but that usually applies to old hardware like gaming consoles, smaller phones, and so forth. This time around, though, it’s software from a bygone era that’s attempting a comeback. Winamp, the formerly hugely popular music player, plans to relaunch in 2021, according to a report by BleepingComputer. What’s surprising about this announcement is that the software hasn’t been updated since 2013, and as we noted, people don’t use media players like Winamp anymore.

First off, if you’re under the age of 30 and are reading this, some explanation is required. You see, back in the 2000s, music streaming wasn’t really a thing yet, so we used to take our music CDs and extract the files into MP3 format. This conversion reduced the file size immensely and also let us transfer the files to a portable music player like the iPod, or for a handful of folks, a Zune player. The small file size also fueled the explosion of P2P file-sharing. Though we had our mobile music needs met, we also needed software to play music on our PCs, and for that a lot of people used Winamp, ourselves included. It offered a ton of cool skins, had a visualizer, and was just fast and free, two things we appreciate in every piece of software.

Installing Winamp in 2021 certainly brings back some good memories. (The fact that you wrote an explainer for this hurt me in my soul. -Ed). 

The big question now is, since everyone uses streaming services like Spotify and Apple Music to listen to their tunes, what place does Winamp even have in today’s market? According to the website, the software isn’t just updated, it’s “remastered,” with the goal of becoming the one app you can use to connect to your favorite artists, which includes podcasters. Winamp will apparently not only be marketed to end users who just want to consume some content, but artists and creators as well who are unhappy with the arrangements provided by today’s most popular streaming services. The site states, “For artists and audio creators we’re all about giving you control over your content. We’ll help you to connect closely with your fans and earn a fairer income from doing what you love.”

Judging by all this marketing copy, the company seems intent on leveraging its nostalgic connection to its “80 million” users around the world, but whether it can do so in a world that has collectively moved on to an entirely new format for music consumption remains to be seen. That said, if you are curious about what the company has coming down the pike, you can download the latest version from its website to get a feel for it. We installed it and it looks exactly like we remembered it from so many years ago. The company is also asking people to sign up for its upcoming beta version, which will supposedly offer all the new features the company is currently teasing via its website.

Now Read:



Rumor Mill: Apple Working on an M1 Max Duo SoC for Upcoming iMac Pro

(Image: Apple)
It’s not a big industry secret that Apple is working feverishly behind the scenes to excise every trace of Intel silicon from its Mac lineup. The company announced its plan to switch from Intel to its own custom chips at WWDC 2020, and later that year it began the replacement process with the original M1 chip landing at the very bottom of its lineup in the MacBook Air, Mac Mini, and the entry-level 24″ iMac. Next up were the upgraded M1 Pro and Max SoCs, which landed in the company’s revamped 2021 MacBook Pros. This leaves just two models on the upgrade path: the “big” 27″ iMac, and the pinnacle of power, the Mac Pro, both of which still use Intel processors and AMD discrete GPUs. According to a report, one of those chips will break cover soon: an M1 Max Duo SoC the company plans to drop into an all-new iMac Pro.

As its name implies, the M1 Max Duo will reportedly be two M1 Max chips connected together, doubling everything the Max chip offers. This translates to a 20-core CPU and a 64-core GPU, along with the ability to boast up to 128GB of RAM. There are also Mac Pro rumors suggesting an M1 Max Quadro with a 4x design. This is a huge upgrade from the original M1 chip, which has just eight CPU cores and seven or eight GPU cores, along with a maximum of 16GB of memory. The sources of these rumors are twofold: Bloomberg journalist Mark Gurman, and Hector Martin, who is porting Linux to Apple silicon Macs.

Apple’s M1 Max chip is a beast, but what if it was doubled or even quadrupled? (Image: Apple)

Gurman, a noted Apple insider, pointed out in a recent tweet that Apple is indeed working on taking the M1 Max die and simply multiplying it both 2x and 4x for upcoming desktop chips. In his tweet he writes, “…the new Mac Pro desktop is expected to come in at least two variations: 2X and 4X the number of CPU and GPU cores as the M1 Max. That’s up to 40 CPU cores and 128 GPU cores on the high-end.” This dovetails with info from Martin, who has been elbow-deep in the macOS code and reports, “…the macOS drivers have plenty of multi-die references, and the IRQ controller in the M1 Pro/Max is very clearly engineered with a (currently unused) second half for a second die.” If that’s not enough information for you, he adds, “For the technically minded: it’s a second set of config/mask/software-gen/hw-state registers, and the hardware inputs are all idle but you can software-gen IRQs in that block just fine and they get delivered with a die-id of 1 in the top 8 bits of the event register.” If you’re more into the video thing, YouTuber MaxTech goes into significant detail about all of these rumors.
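To illustrate what Martin is describing, here is a purely hypothetical sketch of pulling a die ID out of the top 8 bits of a 32-bit interrupt event word. The field layout, constants, and names below are our illustration based on his description, not Apple’s actual register map:

```python
# Hypothetical sketch of the layout Martin describes: a 32-bit IRQ event
# word whose top 8 bits carry the ID of the die that generated the event.
# Field widths and names are illustrative only, not Apple's register map.
DIE_ID_SHIFT = 24    # die ID lives in bits 31..24
DIE_ID_MASK = 0xFF   # 8-bit field

def decode_event(event: int) -> tuple[int, int]:
    """Split a raw event word into (die_id, payload)."""
    die_id = (event >> DIE_ID_SHIFT) & DIE_ID_MASK
    payload = event & ((1 << DIE_ID_SHIFT) - 1)
    return die_id, payload

print(decode_event(0x0100_002A))  # an event from the second die -> (1, 42)
```

The takeaway is simply that a die-ID field baked into the interrupt path means the silicon was designed from the start to route events from more than one die, which is why Martin reads it as evidence of multi-die plans.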

For those of us who are silicon aficionados, it’s been fascinating to watch Apple’s moves in this market, as the M1 chips have upended the notion of what we can expect from a mobile CPU by offering both blistering performance and incredible efficiency, a rare feat indeed. That’s why the prospect of Apple delivering a chip that doesn’t need to go into a mobile device is alluring: the company can theoretically unleash the hounds, since it won’t need to worry about power consumption, within reason at least. The bad news, however, is that the same tipsters offering these tantalizing leaks are also pointing to insane price tags for this much power, with one suggesting the top-end Mac Pro could cost around $50,000. This shouldn’t come as that big of a surprise, though, as you can already spend that much money quite easily on the current Xeon-powered Mac Pro tower, even without the $400 wheels.

Now Read:



Intel Shows Off Next-Gen Chips at Fab 42

(Photo: Stephen Shankland/Cnet)
Cnet reporter Stephen Shankland recently took a tour of Intel’s sprawling Fab 42 in Chandler, Arizona, and Intel let him take a peek at a lot of its upcoming chip designs. The result is a photo gallery that is verified to be Not Work Safe if you’re as into ogling silicon wafers as we are.

As we wrote in 2017, Fab 42 was originally designated as the place where Intel’s future chips would be made on a 7nm process. Back then that size of node was very forward-looking, as Intel was still struggling with its 10nm development, which it recently resolved with the launch of Alder Lake. Now Fab 42 is humming right along on the company’s next-gen products, and just as we predicted four years ago, it’s using Extreme Ultraviolet Lithography, or EUV. This 7nm process is now branded Intel 4 under the company’s renamed node scheme; Alder Lake, by comparison, was made on Intel’s 10nm process, which is now called Intel 7. Further down the roadmap, Intel’s node names will shift to angstroms with Intel 20A; the angstrom is the unit of measurement below the nanometer, where one angstrom = 100 picometers, while one nanometer = 1,000 picometers.

Highlights of Shankland’s trip include a look at the company’s 2023 chips, which are dubbed Meteor Lake. These 14th-generation chips are significant because they are the first client-oriented CPUs to utilize an all-new chiplet design, as opposed to the monolithic design Intel has used in previous chips. This represents a stark departure for Intel, but as nodes get smaller and smaller, both AMD and now Intel have turned their attention to chiplets and packaging as the key to improving performance in their next-gen offerings. Meteor Lake will use Foveros technology to stack chiplets vertically, as opposed to side-by-side. The chips shown to Cnet lacked functioning processing circuitry, and were apparently just being used to test the fab’s packaging capabilities.

Intel’s 7nm Meteor Lake CPUs (Image: Cnet)

Other notable appearances include Intel’s absolutely massive Ponte Vecchio chip, which was designed to power the Department of Energy’s Aurora supercomputer. The chip combines every next-gen technology Intel is currently pursuing, with 47 separate chiplets connected laterally with Embedded Multi-Die Interconnect Bridges (EMIB), and vertically with Foveros stacking.

Intel’s gargantuan Ponte Vecchio processor (Image: Cnet)

Intel giving reporters a look inside its facilities seems like part of its multi-year plan to once again become known as an engineering powerhouse, a mantle it has seemingly lost to rival TSMC in recent years. Part of this strategy includes Intel offering its silicon fabrication services to other companies, which it is now doing under the moniker Intel Foundry Services, and the company has even gone so far as to say it hopes to win back Apple’s business, as the iPhone maker famously jettisoned Chipzilla’s CPUs in favor of TSMC. The same goes for AMD, which also switched to TSMC for its latest chips. As proof of Intel’s commitment to regaining the mantle of engineering supremacy, Cnet notes Intel is currently ramping up two more fabs in Arizona, Fab 52 and 62, at a cost of $20 billion, with plans for a third fab that will cost a whopping $100 billion, location unannounced thus far.

Overall, Shankland’s article provides a fascinating look inside Intel’s operations, and the photo gallery that accompanies it is not to be missed.

Now Read: