Monday, 31 January 2022

Intel CEO on the GPU Shortage: ‘We are On It’

PCGamer recently penned a cry for help to Intel’s CEO Pat Gelsinger on behalf of gamers worldwide who haven’t been able to buy a decent GPU at a reasonable price for a few years. The message was simple: you’re the only company that can step in and end this madness, thanks to the impending arrival of your Arc Alchemist lineup of discrete GPUs. Surprisingly, Intel has responded, saying it’s aware of the issue, and it does indeed plan on being the savior we all hoped would come along some day.

PCGamer’s letter is titled, “An open letter to Intel: Help, you’re our only way out of the GPU crisis,” and it implores the company to enter the GPU market with high volume and reasonable prices. This should theoretically apply pressure to both Nvidia and AMD, which so far have been unable or unwilling to do anything to curb the insane pricing of graphics cards. The RTX 3050 and RX 6500 XT are essentially remakes of previous GPUs with mostly useless ray tracing bolted on and higher price tags. And don’t even get us started on the “refreshed” RTX 3080 12GB, which is almost double the price of the original RTX 3080.

Intel graphics chief Raja Koduri was first to respond to PCGamer, and as the former architect for AMD’s GPU division he has an intimate knowledge of the industry. “I am with you, @pcgamer. This is a huge issue for PC gamers and the industry at large,” he wrote. “@IntelGraphics is working hard to find a path towards the mission – getting millions of Arc GPUs into the hands of PC gamers every year.” Later, Intel’s CEO Pat Gelsinger chimed in, tweeting, “I am with you, too @pcgamer. We are on it.”

Promising words, no doubt, and it’s also significant that Pat replied himself. It’s quite rare for the CEO of a company as large as Intel to be on Twitter replying directly to people, but we assume Raja tipped him off. Still, it’s encouraging that Intel is listening and aware of how it can “disrupt” an industry that has only gotten worse as the pandemic has continued.

There’s a problem with the idea that Intel can fix this, however. Intel is fabricating its Arc GPUs on TSMC’s 6nm process, and TSMC’s foundries are already pushed to the limit, so Intel’s launch may not increase the absolute number of GPUs being built worldwide every month.

Regardless, PCGamer is right that Intel has a huge opportunity in front of it right now, with an industry literally begging for its products. One remaining unknown is how Intel will position and price its GPUs. Previously the company had stated that its Arc lineup would be launching in Q1 2022. It even printed that date on slides shown at CES a few weeks ago. Then the company quietly scrubbed all references to “Q1” for the Arc launch on its website. That’s not a good sign. However, like many gamers, we remain hopeful that Intel has something decent in the pipeline. The company has stated it wants to compete in the “high performance” category as opposed to just releasing a budget card, and lord knows there’s pent-up demand for a graphics card that can run 1440p at high frame rates with ray tracing. According to Intel’s previous teaser video, that’s the company’s performance target. Will Intel deliver? Nobody knows the answer to that question, but the company is clearly aware of the unusual moment it finds itself in.



Sold by Amazon Shut Down After Washington State Investigation

(Photo: Christian Wiediger/Unsplash)
Washington state’s Office of the Attorney General has announced that Amazon will be forced to shut down its Sold by Amazon (SBA) program. The decree comes nearly two years after Attorney General Bob Ferguson launched an antitrust investigation into Amazon’s use of the program.

SBA didn’t exactly enjoy a long run. Started in 2018, the program supposedly helped eligible third-party sellers by automatically adjusting those sellers’ prices in accordance with real-time competition and demand. If a participating seller’s product was normally listed for $20 but Amazon’s algorithms found that the same product was selling for $15 on other websites, Amazon would lower the price to help the product sell. Amazon would then issue the seller their minimum gross proceeds (MGP) to ensure the seller still received a payout that would ultimately support their business.
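The mechanics can be sketched in a few lines of Python. Note that the function names and the fee rate here are our own illustration, not Amazon’s actual terms:

```python
def reprice(listed_price, competitor_price, min_gross_proceeds, fee_rate=0.15):
    """Toy model of a repricing program with a seller-guaranteed floor.

    The item is repriced to match the lowest competing price found, and
    the seller is paid at least their pre-selected minimum gross proceeds
    (MGP). The 15 percent fee rate is a made-up placeholder.
    """
    sale_price = min(listed_price, competitor_price)  # match the competition
    proceeds = sale_price * (1 - fee_rate)            # seller's cut after fees
    payout = max(proceeds, min_gross_proceeds)        # the MGP acts as a floor
    return sale_price, round(payout, 2)

# The $20 item found at $15 elsewhere, with a hypothetical $14 MGP:
sale, payout = reprice(20.00, 15.00, 14.00)
print(sale, payout)  # sells at 15.0; the payout is floored at the 14.0 MGP
```

The floor is what made the deal attractive to sellers, and, per Ferguson, what kept them from undercutting Amazon on their own.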

Proponents of SBA thought the program would please customers and sellers alike, with savings for the former and increased opportunity to fulfill a potential sale for the latter. After all, the ability for sellers to pre-select an MGP would prevent them from facing significant financial loss. Skeptics, however, worried the program would be wielded as a loss leader to protect Amazon’s status as a default option for shoppers. Washington Attorney General Bob Ferguson asserts this is exactly what happened.

(Photo: Andrew Stickelman/Unsplash)

According to the lawsuit Ferguson filed Wednesday, Amazon exclusively extended program invitations to third-party sellers already “responsible for the vast majority of worldwide sales” on the Amazon website. By entering into an agreement that guaranteed the seller would receive their MGP, Amazon prevented those sellers from setting their own prices (or even enacting temporary discounts), which in turn kept those third parties from competing with Amazon itself. Ferguson’s investigation found that this also led customers to purchase Amazon’s private label products, benefitting Amazon while cutting away potential customers for the very third-party sellers the program claimed to support. 

“SBA contracts were entered for the purpose of, and did, artificially raise, stabilize, and maintain online retail prices to consumers within Washington State and the United States,” the lawsuit reads. Ferguson’s filing goes on to describe SBA agreements as “injurious to the public interest” and having “unreasonably restrained trade or e-commerce,” as they inherently involved price-fixing between horizontal competitors.

The lawsuit is joined by a consent decree indicating a settlement between the Office of the Attorney General and Amazon: Amazon will be required to halt the SBA program and pay a $2.25 million fine. The decree also forbids Amazon from ever creating a “new,” differently named program with the SBA’s terms and conditions.

Fulfilling the terms outlined in the consent decree won’t be particularly difficult, given that Amazon suspended its use of SBA in June 2020, just three months after Ferguson announced his office would be investigating the program. The company says it pumped the brakes on SBA for reasons unrelated to legal scrutiny, though it hasn’t said what that reason actually was. Not that it matters: SBA won’t be making a comeback regardless.




EVGA Unveils Outrageous Open Air PC with Carbon Fiber Chassis

(Photo: EVGA)
PC component and accessories maker EVGA has pulled the wraps off its all-new E1 open air PC “concept” design, and the “sports car adjacent” gaming rig is every bit as enticing as a Formula 1 prototype. Though EVGA isn’t known for producing entire PCs, much less exotic ATX chassis, the E1 is like nothing we’ve ever seen. We say that not because of its open air design, which is nothing new, but because of its extensive use of carbon fiber, and because instead of bolting the motherboard to a tray like most cases do, the E1 suspends it in the center of the contraption on steel cables. Some might call it a PC; others might call it a work of art. All we know is EVGA isn’t mentioning price just yet, which tells us it’s one of those “if you have to ask” scenarios. It’s also only available in limited quantities, and to EVGA members only.

Naturally, the E1 is loaded with EVGA’s most high-end components, including several from its Kingpin line, EVGA’s brand of flagship products designed for extreme overclocking. Starting with the chassis: it’s 100 percent 3K plain-weave carbon fiber, meaning each bundle of fibers, called a “tow,” contains 3,000 individual carbon filaments. This makes the frame extremely light, coming in at just 2.76 lbs, while remaining very strong; EVGA claims it’s the lightest chassis in its class by volume. The motherboard hangs from the chassis on steel cables, a patent-pending design the company calls “independent suspension.”

The E1’s carbon chassis is feather weight at just under 3lbs. (Image: EVGA)

At the heart of the system is the company’s soon-to-be-released Z690 Dark Kingpin motherboard, its flagship board for Intel’s latest Alder Lake CPUs. It’s designed for overclocking and includes a 21-phase VRM, a 10-layer PCB, and every bell and whistle you’d expect on such a board, including dual 2.5Gb/s Intel NICs. EVGA isn’t saying which CPU it is using, but it’s a safe bet it will be an overclocked Core i9-12900K.

Bolted to the board is the equally elusive Kingpin GPU, which isn’t identified but will probably be an RTX 3090 Ti, and it features a hybrid design like previous Kingpin GPUs. That means it’ll be connected to the system’s seventh-generation closed-loop liquid cooling system, which is composed of dual 360mm radiators: one in the front of the chassis and one on top. The water block on top of the CPU features an LCD that displays temperatures for the CPU and coolant, as well as fan and pump speeds. What’s really interesting is that the cooling system appears to be connected to a row of analog gauges that sit at the bottom of the chassis, just above the I/O ports.

 

The unique analog gauges of the E1 gaming rig. (Image: EVGA)

Powering the whole thing is an EVGA 1600W Titanium power supply with a “limited” 3K carbon fiber finish, so it’s likely exclusive to this PC and won’t be available for purchase separately. Rounding out the package is a carbon fiber keychain, which looks like a simple fob to us and raises the question: why don’t you need a key to start the PC?!

Since the E1 is a concept, and the video EVGA posted looks like it’s generated from 3D renders, there’s no telling how close a system like this is to reality. EVGA included photos of the carbon fiber frame that look 100 percent authentic, as opposed to renders, so perhaps it has a prototype ready and is fine-tuning its performance. Either way, every single component in this system sits at the very high end of the pricing stratosphere, so when you combine the motherboard, CPU, cooling, PSU, chassis, and the rest, you’re probably somewhere in the $5,000 ballpark, but don’t quote us on that. There’s no word on availability either, but since the motherboard and GPU aren’t even on sale yet, it’ll probably be a few months out.




All Is Not Well With NASA’s Lucy Spacecraft (Updated)

NASA’s Lucy spacecraft is having difficulties with its solar panels. The spacecraft launched on Oct. 16th without incident, and successfully unfolded both its solar panels. But only one of its panels successfully latched into position.

Telemetry via NASA’s Deep Space Network shows that Lucy as a whole is still safe. All other systems are normal. Nevertheless, mission engineers are on the problem.

Initially, NASA acknowledged the issue in a brief blog post, saying, “In the current spacecraft attitude, Lucy can continue to operate with no threat to its health and safety. The team is analyzing spacecraft data to understand the situation and determine next steps to achieve full deployment of the solar array.”

Update (1/30/2022): According to NASA, even without power generation at 100%, Lucy is sailing along in “outbound cruise” mode. The team thinks that a lanyard, which was supposed to pull Lucy’s solar array to full spread, may have lost tension and come unspooled. At a meeting of NASA’s Small Bodies Assessment Group on January 25th, Hal Levison, principal investigator for Lucy, explained that there’s no great rush to find a solution. “We have plenty of time because we’re not scheduled to fire the main engine for a while,” he said. “We’re taking our time to carefully go through our options.”

The current plan is to make an attempt to reel in that lanyard and latch down the solar array “in the late April timeframe”; however, the Lucy mission team is considering leaving it as-is. In the meantime, they’re testing a “dual motor solar array deployment,” which would use both the primary and backup motor. It’s possible that engaging both motors at the same time could apply enough force to get the troubled solar array fully unfolded and latched down.

Power moves

NASA has had to repair damaged and power-starved spacecraft before. Astronauts repaired Skylab while it was in orbit by freeing a jammed hinge on its damaged solar array. The agency successfully contended with a stuck solar panel on the ISS back in 2006, and in 2021 it strategically cleared dust from the solar panels of the InSight lander to restore power.

While the spacecraft itself is not in danger, getting its panels back into mission spec is important. Earth orbits at 1 AU, and Lucy’s targets lie far beyond that: the main belt sits at roughly twice Earth’s distance, while the Trojan swarms orbit with Jupiter at about 5.2 AU. Because solar flux falls off with the square of distance, at Jupiter’s distance the spacecraft will receive only a few percent of the power it would receive in Earth’s orbit around the sun.
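Solar flux falls with the square of the distance from the sun, so a quick sketch shows just how starved a Jupiter-distance spacecraft is (the distances here are approximate):

```python
def relative_solar_flux(distance_au):
    """Fraction of the solar flux at Earth's distance (inverse-square law)."""
    return 1.0 / distance_au ** 2

# From Earth's orbit out to Jupiter's Trojan swarms:
for d_au in (1.0, 2.0, 2.4, 5.2):
    print(f"{d_au:.1f} AU -> {relative_solar_flux(d_au):.1%} of Earth-orbit flux")
```

At roughly 5.2 AU the panels collect under 4 percent of the light they would at Earth, which is why every square inch of array matters.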

Lucy’s orbital track, long-term. (Credit: Southwest Research Institute)

Lucy’s science payload is powered by the solar panels, so its mission could be hampered if those instruments can’t find the power to run. But the spacecraft’s transit to the asteroid belt is controlled by other means. Fourteen hydrazine thrusters, made by Aerojet Rocketdyne, are responsible for propulsion and course corrections.

The fix for Lucy may be as simple as folding the panels and unfolding them. But Skylab was in low-earth orbit, so we had the option to send humans to fix it. We won’t have that option with Lucy, so hopefully the legendary ingenuity of NASA’s earthbound engineers will come through again.

Diamonds, indeed

NASA named Lucy after the fossilized skeleton of a hominin who changed our understanding of human evolution. That hominin was in turn named after the iconic Beatles song, which they were blasting on repeat in base camp that night, because reasons. But there are more diamonds in Lucy’s story.

Scientific instruments aboard the spacecraft include a thermal emission spectrometer called L’TES and a tool called L’Ralph that combines a color camera and an IR spectrometer. And in company with the black-and-white L’LORRI camera (wow, does this roster sound vaguely Vulcan), Lucy carries a disc of pure, flawless lab-grown diamond for one of its instruments. While multiple sources refer to the diamond, we still haven’t found an explanation of which piece of equipment will use the disc or how it contributes to the mission.

Lucy’s mission is to visit the “fossils” of planet formation. This includes the main-belt asteroid 52246 Donaldjohanson, which was named for Donald Johanson, the discoverer of the Lucy fossil. The spacecraft will also visit several asteroids that orbit within the gravitational eddies at the Jupiter-Sun L4 and L5 points: the “Greek camp” and “Trojan camp,” respectively. These eddies exist because they sit at points equidistant from the sun and Jupiter. While the Lagrange points don’t themselves have any gravitational pull, they are relatively stable with respect to the two-body system. Inertia lets orbiting bodies cluster there, rather than being drawn off into another path.

Assuming all goes well, Lucy will reach Donaldjohanson in April 2025.

Feature image credit: NASA’s Goddard Space Flight Center/Conceptual Image Lab/Adriana Gutierrez




Malware Masquerading as Android 2FA App Infected 10,000 Phones Before Removal

There are almost three and a half million apps in the Play Store, and despite Google’s alleged best efforts, malware still slips through every now and then. One recently removed app was particularly malicious, masquerading as a two-factor code manager. Known simply as 2FA Authenticator, the app picked up more than 10,000 installs before security researchers identified it as a vehicle for trojan-dropper malware.

We (and others) are always recommending two-factor authentication (2FA) as one of the best ways to secure your online accounts. Some services rely on one-time SMS codes, but 2FA apps like Google Authenticator and Authy make it easier to manage multiple 2FA tokens. 2FA Authenticator was actually a functional 2FA app: it used the open source Aegis authentication application as a base, but underneath, it contained the Vultur malware.
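Under the hood, apps like these generate time-based one-time passwords (TOTP, defined in RFC 6238). Here’s a minimal sketch using only Python’s standard library; this is an illustration of the algorithm, not the code from any of these apps:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, t=None, digits=6, period=30):
    """Minimal RFC 6238 TOTP (HMAC-SHA-1), the algorithm 2FA apps implement.

    `secret_b32` is the base32-encoded key a service shows you (usually
    via a QR code) when you enable two-factor authentication.
    """
    key = base64.b32decode(secret_b32.upper())
    counter = int((time.time() if t is None else t) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII key "12345678901234567890" at T=59s -> "287082"
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, t=59))  # prints 287082
```

Because the whole scheme is just a shared secret plus a clock, any app holding your secrets can mint valid codes, which is exactly why a trojanized authenticator is so dangerous.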

The genius of this malware is that it looked legit. You wanted a 2FA app, and this one did the job; more than 10,000 people decided to give 2FA Authenticator a shot over more established names. However, the app would abuse Android permissions to copy your app list, location, and other personal information. Vultur is banking malware that aims to steal credentials and financial information. The app would also attempt to disable the lock screen and download third-party apps by pretending they were app updates. This behavior would be suspicious to anyone familiar with how Android works, but that’s not most people.

Before its removal, 2FA Authenticator picked up more than 10,000 downloads.

Perhaps the most troubling innovation in this piece of malware is an implementation of the VNC screen-sharing application. You can probably see where this is going. When people put their 2FA keys into this app, the attacker could watch in real-time to swipe banking details and two-factor codes. 

The app was live for about two weeks, reports Ars Technica, longer than most malware that slips into the Play Store. It’s likely that the app’s functional 2FA capabilities helped it fly under the radar for a period of time. Google has tools to remotely nuke apps downloaded from the Play Store, but it’s unclear if it has done so in this case. Anyone who worries they downloaded the app should factory reset the phone and start changing passwords. The package name is “com.privacy.account.safetyapp.” If you see that in your app settings, it’s time to panic.
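If you’d rather not scroll through settings by hand and you have adb installed, you can dump the installed package list with `adb shell pm list packages` and scan the output. A minimal sketch (the helper function is our own, not an official tool):

```python
SUSPECT = "com.privacy.account.safetyapp"  # package name of the malicious app

def contains_suspect(pm_output, suspect=SUSPECT):
    """Check `adb shell pm list packages` output (lines formatted like
    'package:com.example.app') for a known-bad package name."""
    installed = {line.split(":", 1)[1].strip()
                 for line in pm_output.splitlines()
                 if line.startswith("package:")}
    return suspect in installed

sample = "package:com.android.chrome\npackage:com.privacy.account.safetyapp"
print(contains_suspect(sample))  # True: the malicious package is present
```

If the check comes back positive, the factory-reset-and-change-passwords advice above applies.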




Friday, 28 January 2022

4K vs. UHD: What’s the Difference?

With 4K TVs now mainstream, let’s look at two terms that have become conflated with one another: 4K and UHD, or Ultra HD. TV makers, broadcasters, and tech blogs use them interchangeably, but they didn’t start as the same thing, and technically they still aren’t. From a viewer’s standpoint, there isn’t a huge difference between 4K and UHD. 4K is a more popular term than UHD, but 4K-capable Blu-ray drives are marketed as Ultra HD drives.

4K vs. UHD

In simple terms: 4K is a professional production and cinema standard. UHD is a consumer display and broadcast standard. To discover how they became so confused, let’s look at the history of the two terms.

The term “4K” originally derives from the Digital Cinema Initiatives (DCI). This consortium standardized a spec for the production and digital projection of 4K content. In this case, 4K is 4,096 by 2,160 and is exactly 4x the previous standard for digital editing and projection (2K, or 2,048 by 1,080). 4K refers to the fact that the horizontal pixel count (4,096) is roughly four thousand. The 4K standard is not just a resolution, either: It also defines how 4K content is encoded. A DCI 4K stream is compressed using JPEG2000, can have a bitrate of up to 250Mbps, and employs 12-bit 4:4:4 color depth. (See: How digital technology is reinventing cinema.)

The next step up from Full HD, the official name for the 1,920-by-1,080 display resolution, is UHD, or Ultra High Definition. UHD quadruples that resolution to 3,840 by 2,160. It’s not the same as the DCI 4K resolution described above. Despite this, almost every TV or monitor you see advertised as 4K is actually UHD. There are some panels that are 4,096 by 2,160, which works out to an aspect ratio of 1.9:1, but the vast majority are 3,840 by 2,160, for a 1.78:1 aspect ratio.

A diagram illustrating the relative image size of 4K vs. 1080p — except that 4K should be labeled UHD or 2160p.
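The pixel counts and aspect ratios in question are easy to verify yourself:

```python
# Pixel counts and aspect ratios for Full HD, consumer UHD, and DCI 4K
RESOLUTIONS = {
    "1080p (Full HD)": (1920, 1080),
    "UHD":             (3840, 2160),
    "DCI 4K":          (4096, 2160),
}

full_hd_pixels = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w}x{h}, aspect {w / h:.2f}:1, "
          f"{w * h / full_hd_pixels:.2f}x the pixels of 1080p")
```

UHD works out to exactly 4x the pixels of 1080p at the same 1.78:1 shape, while DCI 4K is slightly wider at 1.90:1.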

Why Not 2160p?

Now, it’s not as if TV manufacturers aren’t aware of the differences between 4K and UHD. But presumably for marketing reasons, they seem to be sticking with 4K. So as not to conflict with the DCI’s actual 4K standard, some TV makers use the phrase “4K UHD,” though others just use “4K.”

The UHD standard is actually two standards. There’s 3,840 by 2,160, and then there’s a big step up, to 7,680 by 4,320, also called UHD. It’s reasonable to refer to these two UHD variants as 4K UHD and 8K UHD. The 8K UHD spec should probably be renamed QUHD (Quad Ultra HD). (Read: 8K UHDTV: How do you send a 48Gbps TV signal over terrestrial airwaves?)

The real solution would have been to abandon the 4K moniker entirely and instead use the designation 2160p. Display and broadcast resolutions have always referred to resolution in terms of horizontal lines. The letters “i” and “p” refer to interlacing, which skips every other line, and progressive scan, which doesn’t. Common standards are 576i (PAL), 480i (NTSC), 576p (DVD), 720p, 1080i, 1080p, and so on.

The reason this didn’t happen is that the number didn’t match the size of the resolution increase. “2160p” implies that the resolution is double that of 1080p HD, while the actual increase is a factor of four: doubling both the horizontal and vertical dimensions quadruples the pixel count. The gap between 720p and 1080p is significantly smaller than the gap between 4K and 1080p, though how much you notice the upgrade will depend on the quality of your TV and where you sit.

Further complicating matters, there’s the fact that just because a display has a 2160p vertical resolution doesn’t mean it supports a 3,840 or 4,096-pixel horizontal width. You’re only likely to see 2160p listed as a monitor resolution, if at all. Newegg lists three displays as supporting 4K explicitly as opposed to UHD (4,096 by 2,160), but they’ll cost you. These sorts of displays target professionals. Now that there are 4K TVs everywhere, it would take a concerted effort from at least one big TV manufacturer to right the ship and abandon the use of 4K in favor of UHD. In all honesty, though, it’s too late. The branding ship has sailed.

YouTube is a bit of an exception to this: it labels video as both UHD and 2160p. A label like “2160p50” (2160p at 50 frames per second) is an attempt to be technically accurate while still telling regular viewers that this is a 4K stream.

When people use 4K and UHD in regular conversation they mostly refer to the same thing. It’s a distinction that matters more in technical contexts, like the ones discussed above.

 

Sebastian Anthony wrote the original version of this article. It has since been updated several times with new information.




RGB Keyboard Support Suggests Gaming Chromebooks Are Coming Soon

Chromebooks started as low-cost machines that relied on the cloud to do everything. Little by little, more powerful processors and faster storage have made their way into these machines, and now Chromebook makers may be gearing up to give them more gaming cred with RGB keyboards. There’s only one reason to do this: Steam-enabled gaming Chromebooks are on the horizon. 

Google has been talking about bringing Steam to Chrome OS for the last few years, but the so-called “Borealis” project has yet to bear fruit. However, the groundwork is there. Google controls Chromebook platform hardware much more stringently than it does on the Android side. Some recent boards like Volteer have enough power via Intel’s Iris Xe Graphics to play some simple games and even AAA titles on low settings. 

The newest wrinkle is alleged support for RGB keyboards in Chrome. RGB has become synonymous with gaming — almost every gaming component and peripheral these days has enough LEDs to be confused with a flying saucer. According to 9to5Google, the open source Chromium project recently added a flag for RGB keyboards. When enabled, the feature will allow gamers to change the color of each key and cycle through multiple effects. That means there will probably be a GUI of some sort, but it’s not available in any public versions of Chrome OS yet. 


The code points to several RGB Chromebooks, two of which are codenamed Vell and Taniks, based on 12th Gen Intel Alder Lake CPUs. 9to5Google speculates these are HP and Lenovo gaming laptops, respectively. There may also be some support for detachable RGB keyboards. A device called Ripple in the open source code appears to be removable; it might be an accessory or simply the keyboard for a 2-in-1 device.

It’s not a sure thing Steam will launch on these RGB-equipped Chromebooks, but it seems like a safe bet. When Steam does finally launch, it’s not going to be the same experience as it is on Windows. Chrome OS is essentially Linux, and many popular games are Windows-only. Still, it’ll be better than playing Android games on your Chromebook. Currently, about 80% of the top 100 Steam games run on Linux, thanks no doubt to Valve’s efforts to make Steam OS a thing. Now, Valve is on the verge of releasing the Steam Deck, which also runs Linux.




Apple Finally Allows Face ID With Masks in iOS 15.4 Beta

Back in 2017 when Apple launched the all-new iPhone X, it was heralded as a brave step forward for the company. It was the first of a new generation of devices that ditched the home button in favor of Face ID, letting people unlock their phones with a glance. Apple had done its homework. Despite initial reservations about replacing a reliable button with something new and unknown, people agreed the new technology was faster, easier, and it “just worked.” Then the pandemic arrived in 2020, and suddenly a lot of people, your humble author included, wished we could go back to having a home button or some sort of in-display fingerprint sensor, since Face ID wasn’t designed to be compatible with a face mask. The only reliable “workarounds” were removing your mask so it could see your face (not always easy to do), typing in your passcode after Face ID failed (time consuming), or using an Apple Watch to unlock the phone (not everyone has one of those). Now it appears Apple has found a middle ground: allowing Face ID to work while wearing a mask. There’s one big caveat, though; according to 9to5mac, it only works on iPhone 12 and newer models.

The feature was trotted out with the latest iOS beta, version 15.4, and instead of scanning your entire face, the phone will scan the “unique features around the eye area to authenticate.” 9to5mac also notes that in addition to details around your eyes, it can also identify several pairs of glasses. Wearing different sets of glasses in different locations will not be a problem.

Despite adding this feature to the beta, Apple states, “Face ID is most accurate when it’s set up for full-face recognition only.” Still, if you have a newer iPhone and want to live life dangerously:

Upon installing the beta software (available only for developers), the OS will prompt you to set up Face ID with a mask. The new setup routine also lets you check a box saying you wear glasses, so it will scan your face both with and without your spectacles on, to get a complete picture.

Setting up Face ID with a mask in iOS 15.4 beta. (Image: 9to5mac)

This newest software addition is an admission of two things: masks will probably be around for quite a while longer, and Apple is definitely not bringing back the home button, despite people asking for it since COVID-19 arrived in 2020. Apple is also obviously not going to add an in-display fingerprint reader like its biggest rivals, including Samsung and Google, much to the chagrin of many Apple aficionados. Still, if Face ID is just as effective with a mask as without, then Apple’s solution might just be a best-of-both-worlds scenario: no expensive hardware upgrades that would increase the price of an already pricey phone, and the same functionality as full-face ID.

According to 9to5mac, it does appear to work quite well. They write: “In our use so far, Face ID With a Mask has been totally consistent at unlocking iPhone and as fast as using Face ID without a mask (quite a bit faster than using Apple Watch to unlock iPhone).” One additional benefit to the new version of Face ID is you can toggle it on and off without having to rescan your face again.




Frogs Regrow Limb With New Treatment

The African clawed frog. (Photo: Smithsonian’s National Zoo & Conservation Biology Institute)
Biological engineers have found a way to help African clawed frogs regrow lost limbs. According to a new study published in the journal Cell Reports, scientists applied topical drugs to amputation sites, resulting in the growth of a “leg-like limb.”  

The African clawed frog is predisposed to regeneration, able to regrow severed limbs throughout much of its youth. But this ability dwindles as the frog reaches adulthood; at a certain level of maturity, attempts to regrow limbs result in cartilaginous spikes with little practical value. Biological engineers at Tufts University, Harvard, and the University of Florida saw this shift as a valuable opportunity to learn how regeneration could be manually encouraged using topical steroids. 

The team got to work applying wearable bioreactors (in this case, small caps) to several frogs’ amputation sites. Each bioreactor contained progesterone, a naturally occurring steroid hormone, suspended in a protein-based hydrogel. Despite remaining on the frogs’ bodies for only 24 hours, the progesterone cocktail induced “robust” bone growth, resulting in the production of “paddle-like” appendages over the span of 18 months.

“It’s not a full limb that’s regrown, but it’s certainly a robust response,” said scientists not involved in the research. “It is particularly promising that only a daylong treatment can have such a positive effect on an adult animal.”

(Image: Celia Herrera-Rincon et. al)

Though the frogs’ new limbs can’t quite be considered legs, they’re certainly more leg-like than the spikes they would have otherwise developed. The study indicates that frogs that didn’t receive progesterone treatment produced growth made up mainly of cartilage, with any new bone mass appearing beneath the amputation plane. In contrast, the treated frogs produced appendages containing “complex, patterned structures” made up of non-ossified and weakly ossified bone. The scientists also found that the administered progesterone remained local to the amputation site instead of dissipating throughout the frog’s system, thus preventing any impact on the rest of the frog’s physiology.

Beyond being impressive in their own right, the study’s results show promise for the future of regeneration research as it relates to amputated limbs. Its authors note that around 2 million Americans have experienced limb amputations. Lost limbs are currently replaced with prosthetics (if anything at all), which are incredibly expensive and must be fitted to the patient’s exact measurements.  It will be challenging to reproduce the frogs’ exciting results in humans or most other mammals, though; after all, African clawed frogs already have regenerative qualities, a privilege we humans unfortunately don’t share. 




US Military Awards $60 Million Contract for Supersonic Passenger Plane

Supersonic air travel has been off the table for almost 20 years, but it could be making a comeback thanks to a US military contract. The Air Force has just awarded a $60 million grant to Colorado-based Boom Supersonic to support its airliner development. When complete, the company’s Overture aircraft could carry dozens of passengers at speeds up to Mach 1.7. 

The three-year contract under the military’s Strategic Funding Increase (or STRATFI) program will help Boom turn its engineering concept into reality. Ars Technica reports the company recently tapped Piedmont Triad International Airport in Greensboro, North Carolina as the manufacturing site of its first full-scale Overture aircraft. Currently, Boom hopes to begin construction in 2024, with the first airplane hitting the runway in 2025. However, the first flight is not expected until 2026, and it might take until the end of the decade before passengers get to board an Overture. 

This won’t be a quantity over quality project for Boom. The 65-88 passengers aboard a hypothetical Overture would get first-class experience with direct aisle access and ample leg room. Concept renders also show huge windows and equally expansive integrated screens at each seat. Boom further claims the vehicle will be 100 percent carbon-neutral once it’s up and running thanks to sustainable fuels. 

You may be wondering why the US military is interested in supporting the development of a passenger airliner when it already has supersonic aircraft. You need only look at what Boom wants to build: a 205-foot aircraft that flies at 1,300 miles per hour with a range of almost 5,000 miles. 

The military regularly buys civilian vehicles for transportation and logistics, and the Overture could be of great use. After all, at the end of the day it’s a supersonic airliner capable of delivering whatever you want anyplace in the world within eight hours. “With STRATFI, we’re able to collaborate with the Air Force on the unique requirements and needs for global military missions, ultimately allowing Boom to better satisfy the needs of the Air Force where it uses commercially derived aircraft,” CEO Blake Scholl said in a press release. 

Travelers last had the option of taking a supersonic flight in 2003, when the Concorde was retired. British Airways and Air France began operating the planes in 1976, but high costs and a tragic crash in 2000 led to the fleet’s retirement. The Overture could eventually fill that niche, but don’t start packing your bags just yet.


This ‘Minimum Viable Computer’ Could Cost Just $15

Computers used to be luxury devices that only the wealthy could afford, but now you can carry a phone in your pocket that’s many times more powerful than the computers that sent men to the moon. However, even the cheapest phones still cost $50 to $100 thanks to licensing and cellular components. Developer Brian Benchoff wanted to see just how cheap a functional computer could be. He came up with the Minimum Viable Computer, a pocket-sized Linux box that could cost as little as $15.

Depending on what you expect a computer to have in order to be “viable,” you might be pleasantly surprised or completely uninterested in the MVP. It uses a simple two-layer PCB built around an Allwinner F1C100s system-on-a-chip. Its single CPU core is clocked at a mere 533MHz, but it can run modern versions of Linux. Don’t expect a GUI, though: as Benchoff envisions it, this is a purely command-line affair. It can run scripts, ping remote servers, and power a variety of USB devices. Also, there’s a physical keyboard.

The device features a split five-row orthogonal keyboard with a small 2.3-inch display in the middle. The screen has a resolution of 240 x 320, and it does not support touch. Can it run Crysis? No, but it does run Doom, which is bundled with the embedded Buildroot Linux OS. This is one of the many decisions made to keep the Minimum Viable Computer as cheap as possible. Another necessary concession is the battery. Shipping lithium-ion cells means dealing with additional regulatory and logistical hurdles, so Benchoff opted for an AAA NiMH cell.

The board doesn’t have any wireless radios, but there is a standard USB-A port for peripherals. You can plug in a Wi-Fi adapter, a keyboard, external storage, and anything else the lsusb utility supports. However, to charge the device, you’ll have to use a separate USB-C port (that’s power only, no data). There’s also a microSD card slot for storage. 

After adding up the bill of materials, Benchoff found the MVP would cost about $14.16, with the single largest expenditure being the PCB for $2. There is one catch, though. That price assumes you’re buying at least 10,000 of each component. Bulk purchasing is the only way to get electronic components this cheap, but that won’t be a problem if people express interest. Benchoff says he intends to make this project a reality, and anyone who wants to be included should reach out on Twitter.
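The arithmetic behind a bill-of-materials estimate like this is easy to sketch. Note that in the snippet below, only the ~$2 PCB figure and the ~$14.16 total come from Benchoff’s write-up; the other line items and their prices are illustrative placeholders, not his actual parts list.

```python
# Hypothetical BOM sketch: (part, unit price in USD at 10,000-unit quantity).
# Only the PCB price and the rough total reflect Benchoff's published figures;
# every other line item here is an illustrative placeholder.
bom = [
    ("2-layer PCB", 2.00),
    ("Allwinner F1C100s SoC", 1.70),
    ("2.3-inch 240x320 LCD", 2.50),
    ("keyboard switches and caps", 3.00),
    ("AAA NiMH cell + charging circuit", 1.80),
    ("connectors, passives, misc", 3.16),
]

def bom_total(items):
    """Sum unit prices; at bulk volume this approximates the per-device cost."""
    return round(sum(price for _, price in items), 2)

print(bom_total(bom))  # -> 14.16
```

The catch Benchoff describes is exactly this: each unit price in such a table is only valid at a quantity tier, so the total holds only if thousands of units are built.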


Thursday, 27 January 2022

Oculus Quest Officially Rebranded to Meta Quest, and Not Everyone Is Happy About It

As of today, Facebook slash Meta has taken another step toward jettisoning its baggage of yore and rebranding itself completely as a metaverse-first company. Though Facebook has officially been called Meta for several months now, we were waiting to see what the company would do with its Oculus brand of virtual reality (VR) headsets, and now we have the answer: all forthcoming VR products will carry the new Meta Quest branding.

The company announced the change in a tweet that stated, “New name. Same mission.” The Twitter account that made the tweet, @oculus, no longer exists, and has been transferred to @metaquestvr, as noted by XDA-Developers. The virtual reality division of Meta will be known as Meta Quest too, so as of now the only part of Oculus that remains is the headsets that haven’t yet been sold. It’s safe to assume that once that stock is liquidated, all new Quest headsets will carry the new branding. Obviously this is just a name change, so it’s not indicative of any actual hardware or roadmap changes for the company. It’s simply an alignment with the parent company’s naming scheme.

Meta’s tweet announcing the news. (Image: @metaquestvr)

Not surprisingly, the news of the rebrand isn’t going over very well, which follows the pattern of when the company changed its name to Meta back in October 2021. People didn’t like the name Meta, and the video the company released to show off the Metaverse was met with widespread mockery. This time around, the internet is annoyed by the rebrand, as documented by Android Central, with the majority of responses on Twitter being fairly savage. Not helping things is the “hey, we’re hip” tagline on the rebranded Twitter account that reads, “New handle, who dis?”

(Image: Paramount Pictures / Android Central)

Meta took note of the disgruntled responses, and replied, “We understand our community will miss the Oculus name, but change doesn’t always have to be a bad thing! We’re hoping to make our ambitions to help build the metaverse more clear with our new name!”

When that wasn’t enough to silence the naysayers, it responded to one person by stating, “We hear you. We all have a strong attachment to the Oculus brand, and this was a very difficult move to make. While we’re changing the brand of the hardware, Oculus will continue to be a core part of our DNA and will live on in things like software and developer tools.”

It seems that despite people’s general loathing of Facebook, the Oculus brand was still held in somewhat high regard because it was separate from Facebook, and people generally agree the Quest 2 is a very good VR headset. Changing its name to Meta Quest 2, or whatever it will be called, robs the company of all the goodwill it’s built up over the years, while transferring everyone’s disdain onto a product that doesn’t deserve it. Plus, as several people on Twitter pointed out, Oculus sounds cool and nobody likes the Metaverse, so why tarnish your headset with such a label? Besides, Facebook and Meta have plenty of subsidiaries that will never be rebranded, including Instagram and WhatsApp. This led some on Twitter to speculate about what it would sound like if Meta did rebrand those properties, with one user cheekily guessing Instagram would become Meta Scroll.


Scientists Have Found a Magnetar In Our Cosmic Backyard

This image shows a new view of the Milky Way from the Murchison Widefield Array, with the lowest frequencies in red, middle frequencies in green, and the highest frequencies in blue. The star icon shows the position of the mysterious repeating transient. Credit: Dr Natasha Hurley-Walker (ICRAR/Curtin) and the GLEAM Team

Astronomers in the Australian outback have discovered something new. Three times an hour, it becomes one of the brightest objects in the sky. The team that discovered it thinks it’s a magnetar — and it’s right in our cosmic backyard.

As the mysterious object rotates, highly polarized or twisted beams of radiation shoot from its poles. Every 18.18 minutes, for 30 to 60 seconds, a beam crosses our line of sight, and the object starts to flash.  “It was kind of spooky for an astronomer because there’s nothing known in the sky that does that,” said team leader Dr. Natasha Hurley-Walker in a statement. Slow transients, like a supernova, might happen on a scale of days to months. Faster ones like pulsars flash on and off within milliseconds. “It’s just every 18.18 minutes, like clockwork,” she said.
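Those figures imply the object is “on” for only a small fraction of each cycle. A quick back-of-the-envelope calculation, using only the numbers quoted above, gives the duty cycle:

```python
# Duty cycle of the transient, from the figures quoted in the article:
# a 30-60 second pulse every 18.18 minutes.
period_s = 18.18 * 60           # repeat period in seconds (~1,091 s)
pulse_min, pulse_max = 30, 60   # observed pulse duration range, seconds

duty_min = pulse_min / period_s  # fraction of the cycle spent flashing (shortest pulse)
duty_max = pulse_max / period_s  # fraction of the cycle spent flashing (longest pulse)
print(f"{duty_min:.1%} to {duty_max:.1%}")  # -> 2.8% to 5.5%
```

In other words, the source is dark roughly 95 percent of the time, which is part of why slow transients like this are easy to miss in survey data.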

Because of this strangely long interval, Dr. Hurley-Walker said the observations match with predictions of an exotic astrophysical object called an ultra-long period magnetar. “But nobody expected to directly detect one like this because we didn’t expect them to be so bright. Somehow it’s converting magnetic energy to radio waves much more effectively than anything we’ve seen before.”

“Because we didn’t expect this kind of radio emission to be possible,” said Dr. Hurley-Walker, “the fact that it exists tells us that some kind of extreme physical processes must be happening.”

This image shows the Milky Way as viewed from Earth. The star icon shows the position of the mysterious repeating transient. Credit: Dr Natasha Hurley-Walker (ICRAR/Curtin).

Dr Hurley-Walker is now monitoring the object with the Murchison Widefield Array (MWA) in Western Australia to see if it flashes the porch light again. “If it does,” she said, “there are telescopes across the Southern Hemisphere and even in orbit that can point straight to it.”

Squish upon a star

Neutron stars rise from the ashes of a core-collapse supernova. Starquakes rock their crusts while incomprehensible levels of force crush the core of a star into a tiny ball. But if a neutron star is rotating fast enough when it’s born, it can develop an internal dynamo effect that supercharges its magnetic field. The beautiful and terrible result is called a magnetar.

These monstrous neutron stars cram more than the mass of the Sun into a sphere about twenty kilometers across. But even at fifty times that distance from a magnetar, the devastating magnetic fields are incompatible with the chemistry of all known life.

Within a magnetic field of magnetar strength, physics as we know it becomes something… different. “X-ray photons readily split in two or merge together. The vacuum itself is polarized, becoming strongly birefringent, like a calcite crystal. Atoms are deformed into long cylinders thinner than the quantum-relativistic wavelength of an electron.”

The Milky Way is somewhere between 100,000 and 200,000 light-years across. At 4,000 light-years away, this magnetar is in our cosmic backyard. It’s extremely powerful, and uncomfortably close. Happily, it won’t last long. After just ten thousand years or so, a magnetar’s magnetic field decays. After that, its outbursts will cease.

Dark skies, deep time

Curtin University student Tyrone O’Doherty was using a method of his own design when he spotted the new magnetar on the block in data from the MWA telescope. “It’s exciting that the source I identified last year has turned out to be such a peculiar object,” said O’Doherty, now studying for a PhD.

The MWA is a low-frequency radio telescope consisting of thousands of spider-legged dipole antennas. To do its work, it uses the dark skies and radio silence of the Australian outback. MWA Director Professor Steven Tingay said the telescope is a precursor instrument for the Square Kilometre Array (SKA).


Tile 107, or “the Outlier” as it is known, is one of 256 tiles of the MWA located 1.5km from the core of the telescope. The MWA is a precursor instrument to the SKA. Photographed by Pete Wheeler, ICRAR

“Key to finding this object, and studying its detailed properties, is the fact that we have been able to collect and store all the data the MWA produces for almost the last decade at the Pawsey Research Supercomputing Centre. Being able to look back through such a massive dataset when you find an object is pretty unique in astronomy,” he said.

Once operational, the SKA will encompass a vast network of telescopes throughout Australia and South Africa. Its incredible resolution will allow researchers to create a three-dimensional map of galaxies out to the edge of the visible universe, and to look back to the time of First Light. The array is expected to make its first observations in 2027.


Meta is Building a Massive New Supercomputer

(Photo: Meta)
Meta, formerly known as Facebook, is working on what it says will be the fastest AI supercomputer in the world, with the goal of powering a new generation of AI. Known as the AI Research SuperCluster (RSC), it’s already among the fastest AI supercomputers in operation. 

Meta’s goal with the supercomputer is to use AI to power real-time interactions, such as the impressive feat of helping “large groups of people, each speaking a different language… seamlessly collaborate on a research project or play an AR game together.” Speech recognition, computer vision, and natural language processing are among the tech giant’s top priorities. But given that this is Meta-slash-Facebook, the metaverse can’t not make an appearance in the RSC’s press release; the company admits it will use its powerful new supercomputer to build AI-driven applications and products that ultimately support the virtual world.

This means the supercomputer has to be extremely reliable, especially because Meta imagines it will run experiments involving thousands of GPUs for weeks at a time. The user interface has to be decent, too, if the company wants to facilitate the groundbreaking research and engineering it says it’s aiming for. Such ease of use and dependability (plus the obvious computing power) relies on a pretty notable build, particularly for a project that started off as fully remote.

(Infographic: Meta)

As with any other AI supercomputer, Meta is assembling RSC by combining multiple GPUs into compute nodes, then connecting them with a high-performance network fabric that allows for ultra-efficient communication. At its current build phase, RSC’s compute nodes comprise 760 of Nvidia’s DGX A100 systems, which communicate over an Nvidia Quantum 1600 Gb/s InfiniBand two-level Clos fabric. (The finished system will connect 16,000 GPUs.) Its storage tier boasts 175 petabytes of Pure Storage FlashArray, 46 petabytes of cache storage from Penguin Computing Altus systems, and 10 petabytes of Pure Storage FlashBlade.
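To put those node counts in GPU terms: each DGX A100 is an eight-GPU system (that per-node count is Nvidia’s standard DGX A100 configuration, not a figure from Meta’s announcement), so the current phase already adds up to thousands of accelerators:

```python
# Scale of Meta's RSC build, from the node count above.
# 8 GPUs per node is the standard Nvidia DGX A100 configuration.
GPUS_PER_DGX_A100 = 8
current_nodes = 760
target_gpus = 16_000  # planned final build-out

current_gpus = current_nodes * GPUS_PER_DGX_A100
print(current_gpus)                           # -> 6080
print(f"{target_gpus / current_gpus:.1f}x")   # finished system is ~2.6x larger
```

So the build Meta describes as merely a “current phase” already houses roughly 6,000 A100s, and the finished cluster is planned to be more than two and a half times bigger.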

Meta estimates it will finish building out the supercomputer around the middle of this year. The project’s location is under wraps for now. On the security front, Meta is enforcing end-to-end encryption across the entire data path and requiring that all information go through a privacy review process to confirm it has been correctly anonymized prior to use.


Nvidia’s RTX 3050 Launches to Modest Praise, Usual Caveats

(Image: PCMag.com)
Nvidia’s “budget friendly” Ampere GPU has finally arrived in the form of the RTX 3050. It’s intended to deliver both ray tracing and DLSS support to the mass market. The swan song for Ampere will officially go on sale January 27th, and reviews went live today showing the card offers great 1080p performance for its $249 MSRP. Unfortunately, that’s not the price it will be sold for, assuming you can even find one in stock.

For a refresher, the RTX 3050 is essentially a cut-down RTX 3060, and uses the same GA106 die as that $329 GPU. Despite being a watered-down version of its big brother, it still has decent specs for an entry-level GPU. It isn’t hobbled like the Radeon RX 6500 XT, which launched last week to online mockery. The RTX 3050 has 8GB of VRAM, compared to 4GB on the Radeon, and it comes with a PCIe 4.0 x16 interface, compared to just an x4 link on the AMD GPU. It has a 128-bit memory bus too, compared to the skinny 64-bit bus on the 6500 XT, though this last gap is somewhat mitigated by the 16MB of L3 cache on the RDNA2 GPU. The RTX 3050’s specs are unsurprising; it’s a standard 1080p high-settings GPU, with the added benefit of theoretically being able to use Nvidia’s ray tracing and DLSS in games that support them.
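The bus-width gap translates directly into raw memory bandwidth via the standard formula (bus width in bits, divided by 8 to get bytes, times the per-pin data rate). The memory speeds below assume the cards’ launch specs (14 Gbps GDDR6 on the RTX 3050, 18 Gbps on the RX 6500 XT); the cache-assisted effective bandwidth of the RDNA2 part is a separate matter.

```python
# Peak memory bandwidth = (bus width / 8 bits-per-byte) * per-pin data rate.
# Memory speeds assume launch specs: 14 Gbps GDDR6 (RTX 3050), 18 Gbps (RX 6500 XT).
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_3050 = bandwidth_gbs(128, 14)   # -> 224.0 GB/s
rx_6500_xt = bandwidth_gbs(64, 18)  # -> 144.0 GB/s
print(rtx_3050, rx_6500_xt)
```

Even with the Radeon’s faster memory chips, the halved bus leaves it well behind, which is why AMD leans on that 16MB Infinity Cache to claw some of the difference back.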

(Image: PCMag.com)

Overall, the card mostly delivers on its promise of 60fps at Ultra settings in AAA titles. Unlike the Radeon card, which is held back by its lack of VRAM and narrow memory bus, the RTX 3050 can run the latest games at maximum detail, either near 60fps or beyond, depending on the game in question. PCMag.com tested the card using an Intel Core i9-10900K CPU, which notably has a PCIe 3.0 interface, but found it made little difference. This was a sticking point with the Radeon card, as its x4 interface hindered its performance on PCIe 3.0 by up to 36 percent. In its gauntlet of AAA titles, the card was able to soar beyond 60fps in every title except two: Assassin’s Creed Odyssey and Red Dead Redemption 2, where it scored 54fps and 49fps, respectively. That’s still pretty decent, remembering once again that this is nominally a $250 GPU. This is not a GPU for 1080p gaming at high refresh rates, however, unless you play games from several years ago (no judgment) or titles that aren’t too graphically demanding, like Fortnite.

Generally speaking, the RTX 3050 performs about the same as Nvidia’s GTX 1660 Ti, which came out nearly three years ago. The big difference this time around is that the GTX 1660 Ti can’t do ray tracing or DLSS. The RTX 3050 can. In PCMag’s testing using the synthetic Port Royal benchmark, the 3050 scored 3,565 compared with 548 for the Radeon RX 6500 XT, and a big fat goose egg for the GTX 1660 Ti. PC Gamer also noted that the 3050 offers “playable ray tracing” in games that support both ray tracing and DLSS, with the latter required to boost frame rates to acceptable levels.

All in all, this is some pretty good news for folks stranded with older GPUs. But here’s the bad news, which you already knew was coming. As has been the case for every single GPU launch in the past two years, nobody knows if the cards will be in stock anywhere, or what their actual price will be. Tom’s Hardware posted a listing of all the RTX 3050s that will be available from Nvidia’s partners, and some of them are launching with ludicrous pricing built in. As an example: though Asus is offering a bargain-bin $249 version, its overclocked versions will sell for an insane $439 and $489. Gigabyte and MSI are also charging an exorbitant $379 for their overclocked versions.

(Image: Tom’s Hardware)

This leaves us with the usual caveats pointed out in the headline and in every review online. If you can find one at or near MSRP, don’t hesitate; take five minutes to mull it over and it will likely be out of stock by the time you try to add it to your cart. We’ll find out what the situation is on the 27th, but given the card’s 8GB of VRAM and relatively low price, it will likely be snatched up by miners rather quickly.


Report: Microsoft Will Keep Call of Duty on PlayStation for Now

Microsoft is on something of a buying spree. After picking up game publisher ZeniMax in 2021, Microsoft is now working toward a massive $68.7 billion takeover of Activision Blizzard. The publisher’s games will no doubt beef up Microsoft’s Xbox Game Pass service, but Sony won’t be left entirely in the cold. A new report says that Microsoft will continue releasing Call of Duty games on PlayStation… at least for now. 

Currently, Microsoft expects to complete the Activision Blizzard acquisition in its 2023 fiscal year, ending in mid-2023. Microsoft could also seal the deal earlier, which would leave it in a sticky situation. According to Bloomberg, Activision Blizzard has contracts with Sony that guarantee several more Call of Duty games on its platform. These titles are some of the most popular (and profitable) in the world right now, so losing access would be a major blow. 

The report claims that Activision Blizzard is using developer Infinity Ward for this year’s CoD game, and the 2023 iteration will go back to Treyarch. Both of these games will most likely come to PlayStation. Phil Spencer, Microsoft’s gaming division CEO, says he has already spoken to Sony representatives and committed to honoring any current contracts. Bloomberg says the deal expires in 2023, so these may be the last mainline Call of Duty games for Sony. The report also says Activision Blizzard is working on a new version of its online battle royale, Call of Duty: Warzone, that will come to PlayStation. 

If you’re curious about Microsoft’s future plans, look no further than the ZeniMax deal. When that transaction was announced, Microsoft was happy to maintain all current agreements for timed exclusives on Sony’s platform. However, Microsoft changed its tune as soon as it fulfilled those commitments. The hotly anticipated Elder Scrolls VI will not come to PlayStation, remaining exclusive to Xbox and PC. Call of Duty and other Activision Blizzard titles could suffer a similar fate.

We’re seeing a snowball effect in gaming just like we have in other forms of digital media. For example, Disney has spent the last decade hoovering up studios like Lucasfilm and Fox, giving it control over even more content. Almost every rights-holder has its own streaming service as well, creating silos of exclusive content. Exclusives have always been part of gaming, but the divisions may be deepening as the industry consolidates.


Wednesday, 26 January 2022

First Major Windows 11 Update Will Bring Android Apps, Taskbar Improvements

Microsoft released Windows 11 last year for new PCs, but the new OS is still slowly spreading to existing devices. Anyone who has taken the plunge into Microsoft’s latest version of Windows will be happy to hear a big update is coming down the pike. The taskbar is getting some much-needed improvements, and Android app support is rolling out in earnest. If you’ve been holding off on upgrading, now may be the time. 

Windows already has access to a huge amount of software, but that’s all desktop software. Both Chrome OS and macOS have added support for running mobile apps, and now Microsoft is doing the same. The promised Android app support debuted as a beta for Insiders last month, but it will be front and center in the new update as a public preview. So yes, it’s still in beta, but you’ll be able to try it out. 

Microsoft does not support the Google Play Store, but it has partnered with Amazon to bring its Appstore for Android to Windows. Amazon launched the Appstore way back in 2011, but it never caught on with Android phone users despite numerous promotions and strategies. As a result, it doesn’t have nearly as much content as the Play Store, and much of what is on offer is intended for Fire tablets. There are workarounds to install the Play Store on Windows 11, but Microsoft is not officially supporting them. 

The upcoming release will also include the return of Microsoft’s taskbar weather widget. There will also be a new mute and unmute feature, and you’ll have the option to show the clock on secondary monitors. Hallelujah. If you were hoping for the return of drag and drop on the taskbar, don’t get your hopes up. Microsoft is still working on that. 

There’s one more notable taskbar improvement, and it’s extremely timely. According to Microsoft, usage of apps like Teams, Slack, and Zoom is up more than six times during the pandemic, which is no surprise with more people than ever working from home. Next month’s update will add a handy feature to the taskbar that lets you share your screen to select meeting clients. Microsoft calls out Teams in particular, but it should work with other programs pending developer support. 

There’s no specific date for the update yet. It will most likely roll out in waves like other Windows updates, but the first step is to upgrade from Windows 10 if you haven’t already.
