
NASA Is Planning A Nuclear-Powered Trip To Mars


Why nuclear makes sense for the Red Planet. Google’s new memory math for AI. Why video games help you sleep. All that and more in this week’s edition of The Prototype. To get it in your inbox, sign up here.

NASA has added an ambitious mission to Mars to its agenda. By the end of 2028, it plans to launch SR-1 Freedom, a spacecraft equipped with nuclear electric propulsion (NEP), to the Red Planet. Once there, it will release a swarm of autonomous helicopters (like the Ingenuity copter that flew with Perseverance) tasked with scouting for water and safe landing zones for future crewed missions.

The spacecraft will be powered by a 20-kilowatt nuclear reactor, which uses uranium fission to generate heat that drives a turbine to produce electricity. A conventional rocket delivers an intense burst of acceleration, after which the spacecraft coasts on inertia the rest of the way. By contrast, this spacecraft will generate gentler but continuous acceleration, building up speed over the course of the journey.
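How much speed can a whisper of thrust build up? A back-of-envelope sketch, with the thrust, spacecraft mass and thrusting duration all assumed for illustration (and mass held constant, ignoring propellant burn-off):

```python
# Back-of-envelope: continuous low thrust compounding over months.
# All numbers are illustrative assumptions, not NASA mission figures.
thrust_n = 0.5          # newtons, plausible for a ~20 kW electric thruster
mass_kg = 10_000        # assumed spacecraft mass, held constant for simplicity
seconds = 180 * 86_400  # 180 days of uninterrupted thrusting

accel = thrust_n / mass_kg  # 5e-5 m/s^2: imperceptible at any instant
delta_v = accel * seconds   # ...but it compounds over the whole cruise
print(f"{delta_v:.0f} m/s gained")  # roughly 780 m/s
```

A chemical engine delivers its speed change in minutes and then the spacecraft coasts; here a comparable change accrues over half a year of thrusting, paid for with far less propellant.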

The bottom line for NEP is that it’s theoretically much more efficient. One of the biggest constraints on launching spacecraft is fuel: it’s heavy, and the more you carry, the harder it is to escape Earth’s gravity. The propellant mass needed with NEP is a fraction of that required by a conventional spacecraft, explained Nikolaos Gatsonis, a professor of aerospace engineering at Worcester Polytechnic Institute.
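That propellant saving falls out of the Tsiolkovsky rocket equation: the fraction of a ship's mass that must be propellant shrinks exponentially as specific impulse (exhaust efficiency) rises, and electric thrusters run at several times the specific impulse of chemical engines. A minimal sketch, with the delta-v and specific-impulse values assumed for illustration rather than taken from any mission design:

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def propellant_fraction(delta_v, isp):
    """Tsiolkovsky rocket equation: fraction of initial mass that must be
    propellant to achieve delta_v (m/s) at specific impulse isp (seconds)."""
    return 1 - math.exp(-delta_v / (isp * G0))

dv = 6000.0  # m/s, an assumed interplanetary delta-v budget

chemical = propellant_fraction(dv, isp=450)   # chemical engine, ~450 s
electric = propellant_fraction(dv, isp=3000)  # ion thruster, ~3000 s

print(f"chemical: {chemical:.0%} of initial mass is propellant")  # ~74%
print(f"electric: {electric:.0%} of initial mass is propellant")  # ~18%
```

The catch, of course, is that the reactor and its radiators add mass of their own, which is exactly the tradeoff the engineers quoted here worry about.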

One challenge in getting that efficiency is whether the reactor can be built small enough to actually save on weight, Ray Sedwick, an aerospace engineer at the University of Maryland, explained to me. “Reactors designed to operate in space have many additional technical challenges, not the least of which is keeping the mass down. Saving on propellant mass only helps if your power plant mass doesn’t end up being more massive than what you save.” One source of mass, he said, would be the radiators necessary to dissipate any heat that isn’t used to generate power.

Another challenge for NASA’s ambitious timeline, Gatsonis said, is that a nuclear reactor has never been integrated into an electrically propelled spacecraft before. That said, he notes that 20-kilowatt non-nuclear systems have been demonstrated before, and Sedwick added that this spacecraft is “leveraging a lot of heritage technology.” So that 2028 timeline might be doable “if the development momentum can be maintained.”

Discovery of the Week: Google’s New Memory Shortcut

Last week, we talked about how a halt in shipping in the Strait of Hormuz could make AI more expensive. (To recap: AI needs memory chips. Memory chip manufacturers need helium. They mostly get that helium from Qatar. Thanks to the Iran war, they can’t get it.) But researchers at Google may have just developed a way to ease that bottleneck: On Tuesday, the company unveiled TurboQuant, a set of algorithms that enable significant memory compression for the key-value cache of search engines and large language models.

AI systems rely on high-dimensional vectors to function. These are complex mathematical representations that capture the essential features of a piece of information. For example, a vector representing a picture of a cat would quantify its eye color, ear shape, fur length and so on. It’s a useful way to store and retrieve information, but it takes up a lot of memory.

What TurboQuant does is compress these vectors into simpler approximate representations, which lets AI models store and retrieve them far more cheaply. (To oversimplify: instead of giving directions as “go three blocks north and three blocks east,” I just say “go about 4.2 blocks northeast.” The latter is simpler and easier to store.) The efficiency gains are significant: in its tests, the researchers found the same information could be compressed to about one-sixth its original size without sacrificing performance.
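The article doesn’t spell out TurboQuant’s internals, but the flavor of the idea can be shown with generic 8-bit scalar quantization, a much simpler relative: store each float32 coordinate as one byte plus a single shared scale factor. Everything here, from the function names to the 768-dimensional vector, is illustrative and not Google’s code:

```python
import numpy as np

def quantize_int8(v):
    """Compress a float32 vector to int8 codes plus one scale factor.
    Generic scalar quantization, illustrative only."""
    scale = np.abs(v).max() / 127.0
    codes = np.round(v / scale).astype(np.int8)
    return codes, scale

def dequantize(codes, scale):
    """Reconstruct an approximation of the original vector."""
    return codes.astype(np.float32) * scale

rng = np.random.default_rng(0)
v = rng.standard_normal(768).astype(np.float32)  # an embedding-sized vector

codes, scale = quantize_int8(v)
v_hat = dequantize(codes, scale)

print("compression ratio:", v.nbytes / codes.nbytes)  # 4.0 (float32 to int8)
print("worst-case error: ", float(np.abs(v - v_hat).max()))  # at most scale / 2
```

Production schemes for KV caches and vector search squeeze harder than this naive 4x (the researchers report roughly 6x) by using fewer bits per coordinate and smarter codebooks while keeping retrieval accuracy nearly intact.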

As these algorithms are refined and integrated into AI models, this could mean that companies will need less memory hardware to support their software. That could bring down costs and lead to fewer tradeoffs between performance and economics.

The Iran War And Emerging Tech

While Google’s discovery might ease one business worry stemming from the Iran war, there are still more economic impacts to come.

Shipping through the Strait of Hormuz is still effectively closed by Iran as its conflict with Israel and the U.S. drags on. As a result, my colleague Christopher Helman reported this week that the world is likely on track for its biggest oil shock since 1973. Already, he wrote, we’re seeing shortages of key products like jet fuel, liquid natural gas and fertilizers. In 1973, the energy shock led to gasoline lines, double-digit inflation and a serious curtailment of economic growth.

Sure enough, gas prices in the U.S. could hit $4 per gallon, and crude oil prices have risen to over $110 a barrel. When fuel prices go up, so does the price of everything else; the OECD now projects inflation will hit 4.2% in the U.S. As a result, markets are betting that the Federal Reserve will start hiking interest rates before the end of the year.

“But Alex,” you may object at this point. “I come to your newsletter every week to read about cool emerging technologies. What does this dour economic news have to do with cool laser stuff?”

The answer, sadly, is everything. Emerging tech startups need money to keep momentum, and it takes years to bring products to market and start earning revenue. Higher inflation means the same amount of money doesn’t go as far. Higher interest rates pressure investors to minimize risk and find more predictable returns. The upshot? Longer fundraising cycles, smaller rounds and an overall preference for later-stage companies that are closer to commercialization.

In other words, early-stage players are going to find it harder to get the capital they need to keep the lights on. Not only will that slow the pace of innovation overall, it could keep potential game-changing tech off the market for years longer than might have happened otherwise.

The Hot Take: Engineering Is About To Be Transformed

Each week, I ask investors for their take on tech trends within their industries. Today I’m featuring thoughts from Shahin Farshchi, a partner at Lux Capital, which invests in founders that are, according to its website, “equal parts mad scientist and business mastermind.”

What is being overhyped right now?

We’re living in a sea of hype, and it’s all starting to look the same: faster chips, world models, humanoid robots, space. We’re seeing companies raise exorbitant amounts of money doing nearly identical things. Cash will be incinerated. But like any cycle, the infrastructure that gets laid down and the talent that gets trained will go on to do amazing things. The hype is wasteful, but it’s rarely without a legacy.

What should more people be talking about today?

The unsexy part of chips. Tomorrow’s compute will be enabled by advanced packaging, interconnect and memory, not faster transistors. The magic isn’t in the chips anymore; it’s in how you connect them.

The unsexy part of satellites. Everyone talks about putting them up. Fewer people ask: How do you talk to them securely and reliably at scale? The answer is laser communications. Sure, anyone can build a laser, but how do you track a satellite shooting across the horizon with an optical setup precise down to micron scales? How do you sustain a link at hundreds of gigabits, if not terabits per second, across a weak optical path? That’s the hard part, and it’s largely unsolved.

The robot-model chicken-and-egg problem. World models are powerful, but they’re only useful when mapped to hardware. The problem: How do you build robots without the models, and how do you build models without the robots? The best companies are attacking both in parallel, but it remains one of the hardest open problems in the field–and it needs more great teams working on it.

What are we all going to be talking about in five years?

Software engineering is the first discipline to be overhauled by AI, but it won’t be the last. Every form of engineering, from bridges to dams to chips to aircraft, is about to be transformed. These fields will shift from heuristics accelerated by computer-aided design to every problem being solved from first principles, with every assumption challenged.

Why do planes have to look the way they do? Why can’t a bridge be designed and optimized in days? Does a radio have to be architected around the same basic stages as those from World War II?

Just as new alloys made heavier-than-air flight possible, and the solid-state transistor brought electronics into the home, AI will systematically dismantle the constraints on what can be built, and how fast. The question won’t be “Is this feasible?” It’ll be “Why hasn’t someone built it yet?”

Anthropic vs. The Pentagon, Ctd: A Federal court has temporarily halted the Department of Defense’s designation of Anthropic as a supply chain risk. In the order, Judge Rita Lin wrote that the Department “designated Anthropic as a supply chain risk because of its ‘hostile manner through the press.’ Punishing Anthropic for bringing public scrutiny to the government’s contracting position is classic illegal First Amendment retaliation.” This means that federal contractors who also work with the Pentagon can continue to use Claude as the litigation continues.

The Trouble With AI Detectors: A lot of people, lawyers among them, are getting in trouble for using AI to write, drawing allegations of plagiarism or, in the case of court documents, citing cases and statutes that don’t exist. The flip side of the problem is that there’s now an array of products claiming to tell the difference between human and AI writing. They are not perfect, which means plenty of people are facing career consequences after being falsely accused of writing with AI, especially those who are neurodivergent or for whom English is a second language. I imagine it will be years before technology and governance come up with a workable equilibrium.

Antimatter Road Trip: Scientists at CERN successfully loaded 92 particles of antimatter into a truck and safely drove them around. This is much easier said than done, because antimatter annihilates on contact with regular matter (the stuff that you, I and pretty much everything else are made of). The trick is building a magnetic field that keeps the antiprotons suspended so they never touch anything. It’s worth the effort: while antimatter can be created at CERN, it can’t be effectively studied there, because the particle accelerators’ magnetic fields disrupt the equipment needed to measure its properties. Once antimatter goes mobile, scientists will be able to study it in more depth.

Pro Science Tip: Video Games Might Help You Sleep

Finding yourself a little too stressed to sleep? Consider playing a round of Call of Duty or Fortnite. That’s according to a study in Sleep Medicine, which looked at the sleep patterns of people who don’t normally play video games. The 18 participants first had their sleep logged and assessed in a variety of ways. Then half of them spent three days playing Call of Duty for an hour before bed while the control half watched an action-packed TV series. The researchers found that the game-playing group reported lower stress levels and better working memory, with no impact on sleep quality.

What’s Entertaining Me This Week

Last weekend, I caught Project Hail Mary in theaters. I’m a big fan of the book, and the movie doesn’t disappoint. The premise of both is that an unknown life form is feasting on our Sun, and Earth sends a team out on a desperate mission to figure out how to stop it. Ryan Gosling does an incredible job in the lead role, which is important, because he’s the only human character on screen for about 80% of the film. It’s heartfelt, optimistic and fun, and even more importantly, made with practical sets and effects rather than relying on CGI, which just makes it look so much better.

P.S. In case you’re wondering how the Claude-powered March Madness brackets I talked about last week are doing, the answer is… about as well as if I’d put them together completely on my own. Which is to say, they’re all busted.


© Forbes