According to Tech Review, a number of different "dashboard" websites have popped up recently, displaying a variety of data sources related to the current war in Iran. They are meant to mimic the sorts of dashboard displays used by governments, but these have been "vibe coded" and don't have traceable sources of data. As well as being entertaining, these sites are often tied to gambling on various war-related topics. Story at https://www.technologyreview.com/2026/03/09/1134063/how-ai-is-turning-the-iran-conflict-into-theater/ or, if paywalled, try https://archive.ph/G0zut
AI-enabled dashboards, combined with prediction markets and fake imagery, are reshaping how war is observed.
[...] As a journalist, I believe these sorts of intelligence tools have a lot of promise. While many of us know that real-time data on shipping routes or power outages exist, it's a powerful thing to actually see it all assembled in one place (though using it to watch a war unfold while you munch on popcorn and place bets turns the war into perverse entertainment). But there are real reasons to think that these sorts of raw data feeds are not as informative as they may feel.
Craig Silverman, a digital investigations expert who teaches investigative techniques, has been keeping a log of these dashboards (he's up to 20). "The concern," he says, "is there's an illusion of being on top of things and being in control, where all you're really doing is just pulling in a ton of signals and not necessarily understanding what you're seeing, or being able to pull out true insights from it." [Silverman page, debunking many things, https://x.com/CraigSilverman ]
One problem has to do with the quality of the information. Many dashboards feature "intel feeds" with AI-generated summaries of complex, ever-changing news events. These can introduce inaccuracies. By design, the data is not especially curated. Instead, the feeds just display everything at once, with a map of strike locations in Iran next to the prices of obscure cryptocurrencies.
Anyone here tried one of these sites? I suspect you need a big monitor...
A 2x4 LEGO brick manufactured in 1958 will snap perfectly onto a brick molded this morning in Denmark, China, Hungary, Mexico, or the Czech Republic. The 66-year-old brick will have the exact same interference fit, the same clutch power, the same 4.8mm stud diameter. This is the result of maintaining mold tolerances to 0.01mm (10 microns) across billions of parts annually.
For hardware engineers developing products with tight-fit mechanical interfaces, LEGO represents an extreme case study in what's possible when you can't compromise on dimensional consistency. A brick that's 0.02mm oversize won't fit into existing structures. A brick that's 0.02mm undersize falls apart when you pick it up. There is no acceptable tolerance range for functional failure. This creates engineering constraints that most consumer products never face. Understanding how LEGO achieves this - and more importantly, where they make deliberate trade-offs - provides practical frameworks for tolerance analysis, mold design, and manufacturing process control.
The frequently cited "0.002mm tolerance" is misleading without context. LEGO's actual mold precision is 10 microns, but different features have different critical tolerances. The cylindrical studs on top are 4.8mm in diameter with a tolerance of ±0.01mm. The hollow tubes underneath create the interference fit that makes bricks stick together. Standard bricks are 9.6mm tall, and three plates stack to exactly one brick height. The cumulative tolerance across a stack of 100 bricks determines whether a tall structure maintains dimensional accuracy.
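The cumulative tolerance across a tall stack can be sketched with a quick stack-up calculation. This is an illustrative sketch, not LEGO's actual method: it compares the worst-case stack-up (every brick at its tolerance limit) against the statistical root-sum-square (RSS) estimate, using the 9.6 mm height and ±0.01 mm mold tolerance from the text.

```python
import math

def stack_tolerance(n, tol):
    """Tolerance stack-up for n stacked parts, each at +/- tol (mm).

    worst_case: every part sits at the same tolerance limit.
    rss: root-sum-square, assuming independent, random part errors.
    """
    worst_case = n * tol
    rss = math.sqrt(n) * tol
    return worst_case, rss

n_bricks = 100                      # a 100-brick tower
nominal_height = n_bricks * 9.6     # 9.6 mm per brick (from the article)
wc, rss = stack_tolerance(n_bricks, 0.01)

print(f"Nominal stack height: {nominal_height:.1f} mm")
print(f"Worst-case variation: +/- {wc:.2f} mm")
print(f"RSS (statistical) variation: +/- {rss:.2f} mm")
```

Worst-case gives ±1.0 mm over a 960 mm tower, but assuming independent errors the RSS estimate is only ±0.1 mm, which is why tall structures built from tightly toleranced parts stay dimensionally true in practice.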
(15 Dec 2025) Fortescue Infinity Train gets 14.5 MWh battery that never needs charging [update]
Goes loaded downhill and recharges the 14.5 MWh battery by "regenerative braking" with enough energy to drag the empty train cars back uphill - it didn't quite fully work.
Fortescue launched two battery-electric locomotives this week, rounding out its fleet of 70 diesel-powered machines hauling iron ore from pit to port.
...The locomotive's battery is the equivalent of "200 to 300 average electric vehicles" and capable of powering a refrigerator for 30 years, according to Mr Otranto.
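The quoted comparisons are easy to sanity-check with back-of-the-envelope arithmetic. The EV pack sizes and refrigerator consumption below are our assumed typical figures, not numbers from the article:

```python
# Sanity-check of the "200 to 300 EVs" and "refrigerator for 30 years"
# claims. Pack sizes and fridge usage are assumptions, not article data.
battery_kwh = 14_500             # 14.5 MWh locomotive battery

ev_small, ev_large = 50, 70      # kWh, typical EV battery pack sizes
print(f"EV equivalents: {battery_kwh // ev_large} to {battery_kwh // ev_small}")

fridge_kwh_per_year = 480        # a typical modern refrigerator
print(f"Refrigerator-years: {battery_kwh / fridge_kwh_per_year:.0f}")
```

With those assumptions the battery works out to roughly 207-290 EV packs and about 30 refrigerator-years, consistent with both quoted claims.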
...
The locomotives, purpose-built by Caterpillar subsidiary Progress Rail, boast what Fortescue has called the world's largest land-mobile battery, with a capacity of 14.5 megawatt-hours. The pair will save the company 1 million litres of diesel each year, still just a fraction of the 80 million litres the company consumes annually.
"It is a large undertaking: these take probably a couple of years to manufacture, so once we pull an order, you can see it will take a couple of years to transition the entire fleet," Mr Otranto said.
The company hopes to complete the transition ahead of its "real-zero" deadline of 2030.
...
The locomotives' massive battery will be charged in two ways. The first is via Fortescue's growing renewable energy apparatus, which it says it is expanding aggressively at a rate of more than 3,000 solar panels a day.
The second charging method is through regenerative braking, a mechanism drawing closely from the company's stalled "Infinity Train" concept.
Andrew Forrest had previously touted an in-house electric rail model, developed with Australian engineering firm Downer Group, that generated all the power it needed using the uphill-downhill dynamics of the Pilbara ranges.
The project was canned, however, in September, axing more than 100 staff.
https://arstechnica.com/science/2026/03/ig-nobels-ceremony-moves-to-europe-over-security-concerns/
Every year, we have a blast covering a fresh crop of winners of the Ig Nobel prizes. After 35 years in Boston, the annual prize ceremony will take place in Zurich, Switzerland, this year and will continue to be held in a European city for the foreseeable future. The reason: concerns about the safety of international travelers, who are increasingly reluctant to travel to the US to participate.
"During the past year, it has become unsafe for our guests to visit the country," Marc Abrahams, master of ceremonies and editor of The Annals of Improbable Research magazine, told The Associated Press. "We cannot in good conscience ask the new winners, or the international journalists who cover the event, to travel to the US this year."
Established in 1991, the Ig Nobels are a good-natured parody of the Nobel Prizes; they honor "achievements that first make people laugh and then make them think." As the motto implies, the research being honored might seem ridiculous at first glance, but that doesn't mean it's devoid of scientific merit. The unapologetically campy awards ceremony features miniature operas, scientific demos, and the 24/7 lectures, in which experts must explain their work twice: once in 24 seconds and again in just seven words.
Traditionally, the awards ceremony and related Ig Nobel events have taken place in Boston at Harvard University, Massachusetts Institute of Technology, and Boston University. However, four of last year's 10 winners opted to skip the ceremony rather than travel to the US, and the situation has not improved.
Nor is it just the Ig Nobels being affected by the hostile US environment for international travel. Many international gaming developers are choosing to skip this year's weeklong Game Developers Conference in San Francisco, citing similar concerns. "I honestly don't know anyone who is not from the US who is planning on going to the next GDC," Godot Foundation Executive Director Emilio Coppola, who's based in Spain, told Ars. "We never felt super safe, but now we are not willing to risk it."
So this year, the Ig Nobel organizers are joining forces with the ETH Domain and the University of Zurich for hosting duties. "Switzerland has nurtured many unexpected good things—Albert Einstein's physics, the world economy, and the cuckoo clock leap to mind—and is again helping the world appreciate improbable people and ideas," Abrahams said.
The Ig Nobels will not be returning to the US any time soon. Instead, the plan is for Zurich to host every second year; in odd-numbered years, the ceremony will be hosted by a different European city. Abrahams likened the arrangement to the Eurovision Song Contest.
Drawing on the latest evidence, the authors show that neither the crisis narrative around low fertility nor the expectation that fertility will rebound as development continues is supported by available data, and they argue that persistently low fertility can be sustainable and even economically desirable.
In their piece, published in Nature Human Behaviour, IIASA Distinguished Emeritus Research Scholar Wolfgang Lutz and IIASA Senior Researcher Guillaume Marois, who is also an associate professor at the Asian Demographic Research Institute of Shanghai University, respond to political and public concern over declining birth rates in highly developed countries. While low fertility is increasingly framed as a crisis, associated with population ageing, labor shortages, and fiscal pressure, the authors argue that this narrative is based on outdated assumptions that no longer reflect current demographic realities.
A central motivation for the paper was the widespread belief based on earlier studies that fertility would recover as human development continues. However, using the most recent data up to 2023, the authors demonstrate that this pattern has reversed. Today, the global cross-sectional relationship is clearly negative: the higher a country's Human Development Index, the lower its fertility tends to be.
"This finding came as a surprise to much of the demographic community," says Marois. "Even countries once considered models for balancing work and family life, such as the Nordics, have experienced unexpectedly steep fertility declines. The idea that development alone will bring fertility back up simply doesn't hold anymore."
The commentary also questions the normative status of replacement-level fertility, often defined as 2.1 children per woman. This benchmark, the authors argue, is an artificial construct that only leads to long-term population stability under unrealistic assumptions, notably the absence of further mortality decline. More importantly, population stability does not automatically translate into economic or social wellbeing.
Instead, the authors emphasize that economic sustainability depends more on population structure than on population size. Higher levels of education, increased labor force participation, and rising productivity can offset – and even outweigh – the effects of having fewer births. Lower fertility can enable greater investment per child, strengthening human capital and innovation while reducing dependency burdens over the coming decades.
The policy implications are clear. While pro-natalist measures can improve family wellbeing, increasing fertility should not be their main objective, as their impact on fertility is typically modest and higher fertility does not necessarily improve economic wellbeing. Governments should instead adapt social security, labor market, and pension systems to the reality of sustained low fertility, while strengthening investments in education and productivity. This shift is particularly relevant for countries such as South Korea, China, and Japan, which currently record some of the world's lowest fertility rates and face especially intense political pressure to raise birth numbers.
"Our message is not that low fertility is inherently good or bad," concludes Lutz. "There is no single 'ideal' fertility level that guarantees prosperity. Instead of trying to push birth rates back to an arbitrary target, governments should focus on adapting social security systems to the changing demographic realities and invest strongly in education and productivity. Under those conditions, societies can thrive even with fewer births."
Journal Reference: Marois, G., & Lutz, W. (2026). Low fertility may persist and could be good for the economy. Nature Human Behaviour. DOI: https://www.nature.com/articles/s41562-026-02423-6
Since mid-2025, Oracle, Crusoe, and OpenAI have discussed increasing data center power capacity from about 1.2 GW to roughly 2.0 GW, amid reluctance from locals. Negotiations were complicated by difficult financing terms and OpenAI's shifting capacity forecasts, and ultimately collapsed, according to Bloomberg. Nonetheless, development of the 1,000-acre campus remains underway, and multiple facilities are already in service, though preliminary agreements to rent a substantial expansion were dropped. Could this be a signal that the Stargate project is failing even as the AI industry as a whole is on the rise?
The Abilene campus remains one of the biggest AI data center projects announced so far, yet to date, it has been known primarily as a part of the widely publicized Stargate project. Oracle has been rapidly installing Nvidia-based servers used by OpenAI to train and deploy AI models and systems. However, relations between Oracle and Crusoe have been strained by reliability issues. Earlier this year, winter weather disrupted parts of the liquid-cooling infrastructure, forcing several buildings offline for multiple days. Both companies say cooperation remains strong and development continues swiftly, yet the source report clearly notes hiccups.
Given the rising tensions between the Stargate partners, Crusoe began searching for another tenant, according to Bloomberg. At that point, Nvidia reportedly stepped in to help ensure the site would continue deploying its hardware rather than systems powered by AMD. Furthermore, Nvidia provided Crusoe with a $150 million deposit and assisted efforts to attract Meta — which is not a part of the Stargate project — as a prospective tenant for the additional capacity, the report says. Meanwhile, Meta has yet to confirm its expansion at the Abilene campus.
Despite shelving the expansion of one Stargate project, Oracle's general partnership with OpenAI remains unchanged. In July last year, Oracle agreed to develop 4.5 GW of data center capacity for OpenAI, and that program continues. The companies have also announced projects in other locations, including a site near Detroit owned by Related Digital.
*One gigawatt is comparable to the output of a nuclear reactor and can supply electricity to hundreds of thousands of homes at peak usage. That said, a nuclear power plant was not reported to be part of the negotiations, which may explain why locals objected to increasing power capacity with sources like coal or gas generators - though we are speculating here.
When personalized ads get too intrusive, consumers are less likely to buy:
Years into the grand experiment of personalized digital marketing, most of us have had the experience: You search for a product — or just casually mention it. Suddenly, ads for that exact item stalk you across apps, websites, and social media. The targeting may be technically impressive, but it can feel unsettling.
That uneasy sentiment is at the center of new research by Wayne Hoyer, professor of marketing and James L. Bayless/W.S. Farish Fund Chair for Free Enterprise at the McCombs School of Business at The University of Texas at Austin. He finds that when digital personalization crosses perceived boundaries, it triggers a powerful emotional response, which he calls "creepiness." That response can backfire on digital marketers by materially reducing consumers' willingness to buy.
The study, conducted by Hoyer and three marketing researchers from the University of Bern in Switzerland — Alisa Petrova, Lucia Malär, and Harley Krohmer — argues that creepiness is not a property of digital marketing itself. Instead, it is a structured emotional episode that unfolds inside the consumer in response to marketing.
The response has two parts: first, uncertainty about what's behind a marketing message; then, the judgment that it's a threatening form of surveillance.
"When consumers are exposed to these ads, they make an assessment of ambiguity, such as, 'What is this?' and whether this is intrusive surveillance, such as, 'Are they watching me?'" Hoyer explains.
"If the answer is yes, this creates a negative emotion that can negatively affect purchase intentions."
[...] What can brands do to mitigate feelings of creepiness?
In a final experiment, the researchers tested a variety of remedies, such as transparency about data use, assurances of good intentions, offers of discounts, and charitable donations. They also tried including positive emotional images in ads: pictures of kittens.
Perhaps unsurprisingly, the kittens proved somewhat effective at de-creeping consumer reactions and softening damage to their plans to buy. Offering monetary compensation also helped.
Overall, however, even the best interventions made only limited improvements to purchase intentions. Hoyer says, "Creepiness is robust and difficult to mitigate once triggered."
This means that prevention is key, he says. It's more effective to avoid creating bad feelings in the first place than try to repair them after the fact.
"Managers should focus on prevention by designing personalization practices that minimize ambiguity and try to avoid signals of intrusive surveillance," Hoyer says.
The study suggests developing a Creepiness Level Index as a tool to help marketers track negative reactions to digital ads.
Over the long run, though, the marketing risk might diminish, he adds. "It is possible that creepiness will decline as consumers become more used to personalization and more accepting of AI technology."
Journal Reference: https://doi.org/10.1002/mar.70089
The Shahed 136 drone was invented by Iran and then copied by the US, but it was originally a Cold War weapon.
Iran invented the relatively simple Shahed 136 attack drone, but is now fending off US copies launched against it in combat. Why, when the US military has expensive, cutting-edge and hi-tech weapons, is it making flimsy drones powered by a motorbike engine?
Iranian company Shahed Aviation Industries originally designed the 136. It is 2.6 metres long and can carry 15-kilogram payloads over distances of about 2500 kilometres. It travels at a relatively modest speed of around 185 kilometres per hour – far slower than cruise missiles or bomb-carrying aircraft. But it has the advantage of extremely low cost – perhaps as low as $50,000 per unit.
Shaheds are now used in their hundreds in daily strikes on Ukraine by Russia, requiring layers of air defence – including fighter jets, machine guns, missiles and interceptor drones – to try to bring them down before they hit civilian or military targets. They are even in use by Houthi forces in Yemen.
Iran has been using Shahed drones as well as a range of other hardware in attacks around the Gulf this week in retaliation for US and Israeli strikes. In return, the US military has used its Low-cost Uncrewed Combat Attack System (LUCAS), a reverse-engineered copy of the Shahed 136 produced by Arizona-based Spektreworks, in combat for the first time against Iran. This means that Iran's own design is now being used against it.
[...] The US reportedly reverse-engineered the drone after capturing units from Iranian-backed militias in Iraq and Syria, and it was successfully test launched from a US Navy ship last year.
[...] “You’re knocking them out of the sky with ordnance that’s way more expensive not just than the Shahed, but sometimes it’s more expensive than the thing that the Shahed is actually hitting,” says King. “There have been loads of cases where the target the Shahed is hitting is cheaper than the Patriot missile [used to take it down]. The appearance of these kind of crude, but effective, remote systems changes the economic calculus of war in an interesting way.”
[...] Ian Muirhead at the University of Manchester, UK, who previously spent 23 years in the military, says that Shahed drones will never replace crewed aircraft or highly advanced missiles, but that they are increasingly finding a place in combat and that western militaries are learning lessons from the war in Ukraine and adopting similar weapons.
“A lot of modern weapons are extremely complex and expensive, and if you’re having large-scale conflicts like this, having lots of cheap, expendable weapons – particularly if you don’t have big armies any more – is more effective,” says Muirhead. “If you can send a thousand of them, you can overwhelm defences with cheap munitions.”
“It’s just economics: if it costs you 10 times more for your defence than it is for your attackers, you’re never going to be able to outpace the other side,” says Muirhead.
Big Tech is set to agree to build its own power plants for data centers and shield consumers from rising electricity costs, but companies face daunting logistical obstacles to delivering on the pledge championed by President Donald Trump.
At a White House event on Wednesday, executives from Amazon, Google, Meta, Microsoft, xAI, Oracle, and OpenAI are due to sign the pledge to supply their own power instead of relying on a grid connection.
Trump hailed the plan in his State of the Union speech last week, promising US consumers that "no one's prices will go up" as a result of "energy demand from AI data centers."
But industry executives have suggested the commitment will not be binding, while experts warn it is likely impossible to fully insulate consumers from the extra power demand coming from the vast expansion of data centers to run AI.
"Regardless of how these data centers connect, behind the meter or as part of the network, you're going to increase demand," said Ari Peskoe, director at Harvard Law School's Electricity Law Initiative.
Independent power supplies for data centers most often come from gas turbines, which are in short supply and not always designed to provide continuous power. "We still need more of these turbines," Peskoe added.
Trump's pressure on big data center operators comes in response to consumer backlash and political pressure over rising power bills.
On the campaign trail in 2024, Trump pledged to cut energy bills in half within a year of taking office.
In reality, residential electricity costs rose by 6 percent nationwide in February, compared with a year before, according to the US Energy Information Administration.
States such as New Jersey and Pennsylvania, which have clusters of data centers, reported bigger increases at 16 percent and 19 percent respectively.
Natural gas prices, extreme weather, and the need to upgrade aging grid infrastructure have all contributed to higher costs—after decades of low investment in power plants and transmission lines. The hit to energy supplies from Trump's war against Iran could add to the problem.
Critics of data centers say they are increasing energy bills by adding to demand. US data center power demand will more than triple by 2035, rising from almost 35 gigawatts in 2024 to 106 GW, according to data from BloombergNEF.
To avoid political backlash and waits of up to four years for grid connections, tech companies are already building their own power supplies for many new data centers.
Nearly three-quarters of planned generation equipment for data centers is natural gas fired, according to energy research firm Cleanview, which is tracking 56 GW of projects across the US.
Wednesday's pledge would see tech companies expand these efforts to prevent higher power costs being pushed on to customer bills.
Josh Price, director of energy and utilities at strategy firm Capstone, said Big Tech was "trying to push back against the narrative that they're the bad guy."
But the boom in data center building is already pushing the limits of the supply chain for power generation, making it difficult for companies to meet their commitment to Trump.
Competition for gas turbines is fierce, with waits as long as seven years for new orders.
Turbine-maker GE Vernova said it would expand production by 25 percent, and Mitsubishi Power announced plans to double its output over the next two years. But manufacturers have been cautious about expanding capacity, and it may not be enough to meet booming demand.
Two-thirds of gas projects in development in the US have not announced a turbine manufacturer, according to Global Energy Monitor.
The price of gas turbines has risen sharply, and greater competition from tech companies will mean higher costs for utilities and industrial customers who also need generating capacity—costs that could still be passed on to ratepayers.
To overcome shortages, data centers are increasingly relying on alternatives. Companies, including Google and Microsoft, have also struck deals to reopen nuclear power plants, but these plans will take years to deliver.
In the near term, companies are using options such as reciprocating engines and diesel generators. Experts point out that these power sources, as well as ordinary gas turbines, are not designed to provide the kind of continuous power needed by data centers.
"They say, 'we have documented evidence that these can run 90 percent of the time'... But that's not the average use case," said Jigar Shah, an energy investor and former Department of Energy official.
Keeping these data centers, and their power supplies, operational for decades would also present challenges around securing spare parts and qualified technicians, he added.
Shah said: "The level of ineptitude by which the data center companies are sleepwalking into major problems just seems shocking for trillion-dollar companies."
AWS confirmed on its health dashboard that two facilities in the UAE were "directly struck" and that a third site in Bahrain sustained damage from a nearby explosion. The strikes caused structural damage, disrupted power delivery, and, in some cases, triggered fire suppression systems that produced additional water damage, according to the AWS Health Dashboard. Amazon told customers it expects recovery to be prolonged "given the nature of the physical damage involved".
These outages then cascaded into consumer-facing services across the Gulf. Ride-sharing and delivery platform Careem, payments firms Hubpay and Alaan, data management company Snowflake, and several major UAE banks — including Emirates NBD, First Abu Dhabi Bank, and Abu Dhabi Commercial Bank — all reported disruptions. AWS advised customers to activate disaster recovery plans and migrate workloads away from the affected Middle East regions.
Iran's Islamic Revolutionary Guard Corps stated it targeted the Bahrain facility specifically because AWS hosts U.S. military workloads there; AWS declined to comment on that claim. Sean Gorman, Air Force contractor and CEO of Zephr.xyz, told DefenseScoop on Tuesday that classified government workloads at Impact Level 4 and 5 are held in U.S.-only facilities, but acknowledged that “contractor and non-operational data… may have been impacted,” at the struck sites.
The attacks followed joint U.S.-Israeli strikes on Iran over the last week. AWS urged customers with workloads in the region to migrate to unaffected regions while repairs continue.
https://www.slashgear.com/2116892/canada-discovery-botswana-rare-earth-minerals-tech-companies-want/
Rare earth elements (REE) are a group of 17 metallic elements that are increasingly important in the modern world. This importance is fueling growing global demand for these essential commodities. The figures speak for themselves: worldwide REE production reached 390,000 metric tons in 2024 — nearly triple the level recorded in 2017. Currently, the vast majority of these minerals are mined in China, which accounted for 69% of total production in 2024.
Now, a new discovery by Canadian company Tsodilo Resources Limited could bring a new player into the market — Botswana. This is important, as much of the modern world relies on these elements: wind turbines, telecommunication systems, the defense industry, and many high-tech consumer products all depend on them, including the rare earth magnets used in electric vehicles.
In an announcement, Tsodilo said that drilling at its Gcwihaba Metals Project identified high-grade mineralization at two exploration targets known as C26 and C27. The deposit, found at depths of between 20 and 50 meters (66 and 164 feet) below the surface, is located in what geologists call a skarn system — a type of metamorphic rock formation that has been altered by hot, chemically active fluids associated with magma.
Company data indicates that the deposit contains all 15 REEs listed as critical by the U.S. Geological Survey, along with more common metals like copper, cobalt, nickel, and silver. The project remains in the exploration stage, with further drilling planned to define its full scale.
Despite the name, rare earths are not actually that rare. For instance, cerium — an element used in catalytic converters — is actually the planet's 25th most abundant element. However, what is rare is finding REEs in high enough concentrations that mining becomes viable. The initial findings from the Gcwihaba sites suggest that this is one of these rare instances.
[...] Despite the announcement, the Gcwihaba project is still in the exploration stage. Tsodilo has outlined a conceptual exploration target based on drilling at the C26 and C27 zones, but it hasn't yet published a compliant mineral resource estimate. Further drilling is planned to better define the size, grade, and economic viability of the project.
Even if additional studies confirm the presence of a viable REE resource, we are unlikely to see Botswanan dysprosium anytime soon. Factors like environmental assessments, permitting, infrastructure development, and financing all add time and complexity. Then there's the processing to consider. As noted, China accounts for about 69% of the world's REE production; when it comes to the separation and processing of these minerals, however, its share rises to about 90%.
Still, Botswana's reputation as one of Africa's more mining-friendly countries could work in the project's favor. The country already has a thriving diamond mining industry — it's the world's second biggest producer of the gem — but recent market uncertainties have prompted it to diversify.
While the discovery does not immediately alter the global REE balance, it represents another potential source at a time when governments and technology companies are actively looking to reduce supply chain risk and ease the potential threat of China grinding worldwide car production to a halt.
With no enforcement and questionable economics, it may not make a difference:
On Wednesday, the Trump administration announced that a large collection of tech companies had signed on to what it's calling the Ratepayer Protection Pledge. By agreeing, the initial signatories—Amazon, Google, Meta, Microsoft, OpenAI, Oracle, and xAI—are saying they will pay for the new generation and transmission capacities needed for any additional data centers they build. But the agreement has no enforcement mechanism, and it will likely run into issues with hardware supplies. It also ignores basic economics.
Other than that, it seems like a great idea.
The agreement is quite simple, laying out five points. The key ones are the first three: that the companies building data centers pledge to pay for new generating capacity, either building it themselves or paying for it as part of a new or expanded power plant. They'll also pay for any transmission infrastructure needed to connect their data centers and the new supply to the grid and will cover these costs whether or not the power ultimately gets used by their facilities.
The companies also pledge to consider allowing the local grid to use on-site backup generators to handle emergency power shortages affecting the community. They will also hire and train locally when they build new data centers.
The agreement suggests that these promises will protect American consumers from price hikes due to the expansion of data centers and will somehow "lower electricity costs for consumers in the long term." How that will happen is not specified.
Also missing from the agreement is any sort of enforcement mechanism. If a company decides to ignore the agreement, the worst it is guaranteed to suffer is bad publicity, something these companies already have experience handling. That said, Trump has been known to resort to blatantly illegal tactics to pressure companies to conform to his wishes, so ignoring the agreement carries risks.
[...] As recent coverage has made clear, most of the companies plan to handle (or are already handling) the added power demand with natural-gas-generating equipment. But there's a limited supply of such equipment; various sources quote wait times of up to seven years. That's longer than even the planned timeline for a new nuclear power plant. While there's likely to be some expanded manufacturing capacity due to the surging demand, the companies that build gas turbines will be very hesitant to invest in meeting demand that is likely to be transient.
Even if they did, basic economics indicate that expanded use of this fuel would raise consumer costs, as it would mean more competition for the supply of natural gas used either directly by consumers for heating or by grid operators that supply consumers with electricity. That will likely force utilities to meet demand with plants that are rarely used at present, typically because those plants are less efficient and more expensive to run.
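The pricing mechanism behind that claim can be sketched with a toy merit-order model: plants are dispatched cheapest-first, and the most expensive plant that must run to meet demand sets the price. The plant names, capacities, and costs below are illustrative assumptions, not real grid data.

```python
# Toy merit-order dispatch: plants run cheapest-first, and the marginal
# (most expensive running) plant sets the clearing price everyone pays.
# All capacities (GW) and marginal costs ($/MWh) are made-up illustrations.
plants = [  # (name, capacity_GW, cost_$_per_MWh), sorted by cost
    ("efficient gas", 50, 40),
    ("older gas", 20, 70),
    ("peaker", 10, 120),
]

def clearing_price(demand_gw):
    """Return the marginal cost of the last plant needed to meet demand."""
    remaining = demand_gw
    for name, capacity, cost in plants:
        remaining -= capacity
        if remaining <= 0:
            return cost  # this plant is on the margin and sets the price
    raise ValueError("demand exceeds total capacity")

print(clearing_price(45))  # 40: efficient plants alone cover demand
print(clearing_price(55))  # 70: extra load forces older, pricier plants online
```

Even a modest bump in demand (45 GW to 55 GW here) nearly doubles the clearing price for all consumers, because the less efficient plant now sets the price, which is the cost-shifting mechanism the article describes.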
[...] So it's very unlikely that data center builders will be able to meet the added demands with natural gas. Even if they could, it would shift costs onto consumers unless we somehow scaled natural gas production to match while keeping overseas consumers from buying the excess.
Are there alternatives? We haven't built any coal plants in decades, and many of the ones still in operation are reaching their normal end-of-life points; the electricity they produce is also expensive relative to the alternatives. It's highly unlikely that anyone would invest in new coal plants given the cost and environmental consequences.
Nuclear is a questionable option. There are plans to restart a couple of shuttered plants—the secretary of energy is holding a press conference at one of them, Indian Point in New York, on Friday, suggesting further efforts in that regard. But there simply aren't enough shuttered plants to make a difference. The administration is promoting small modular nuclear power and hopes to have some test reactors built within the next few years. But it will likely take considerably longer than that before they can be widely deployed.
That leaves a combination of solar and batteries as one of the most viable alternatives, although it's still more expensive than most natural gas plants at present. That combination is already being installed at record paces—solar output in the US has grown by over 30 percent for two years in a row. It's not clear how much faster we could be installing them, and in any case, it's clear that the administration doesn't view them as a solution. The announcement specifically takes a shot at policies favoring renewable energy, saying, "President Trump terminated the job-killing 'Green New Scam,' ended massive taxpayer subsidies for unreliable energy sources, and rescinded the Biden Administration's anti-American energy regulations."
Whatever is built will also face the general challenge that transmission remains a huge problem, with many proposed power plants waiting in the queue for years for interconnects to the wider grid.
Supplying the data center boom with power in a way that's minimally disruptive to the wider public will be an extremely difficult challenge. Dismissing it with a toothless agreement that hopes the companies involved will solve all the problems is not cause for optimism that we're prepared to meet that challenge.
It's Official: The Cybertruck is More Explosive Than the Ford Pinto:
We now have a full year of data for the Cybertruck, and a strange preponderance of headlines about Cybertrucks exploding into flames, including several fatalities. That's more than enough data to compare to the Ford Pinto, a car so notoriously combustible that it has become a watchword for corporate greed.
Let's start with the data summary, then we'll do a deep dive.
TL;DR: The Cybertruck is 17 times more likely to have a fire fatality than a Ford Pinto
With that maddening statistic out of the way, let's dive into the numbers.
Here's the table, with all sources linked below.
Cybertruck and Ford Pinto Fire Fatalities
+--------------------------+-------------+---------------+-----------------------+
| Vehicle Model | Total Units | Reported Fire | Fatality Rate |
| | | Fatalities | (Per 100,000 units) |
+--------------------------+-------------+---------------+-----------------------+
| Tesla Cybertruck | 34,438 | 5 | 14.52 |
| Ford Pinto (1971-1980) | 3,173,491 | 27 | 0.85 |
+--------------------------+-------------+---------------+-----------------------+
A thin, soft and slippery layer of clay-rich mud embedded in rock below the seafloor intensified the 2011 Japan earthquake that produced a tsunami, killing tens of thousands of people and damaging the Fukushima Daiichi nuclear power plant.
The discovery, which is published in the journal Science, was made by an international team of scientists including researchers from The Australian National University (ANU).
Onboard the world's most advanced drilling-equipped science vessel, Chikyu, the team sailed to the Japan Trench in late 2024 to investigate what caused the Tōhoku-oki fault to rupture and trigger the earthquake.
The researchers drilled up to 7,906 metres below the sea surface, setting a Guinness World Record for the deepest scientific ocean drilling ever conducted.
Earth core samples recovered from in and around the fault zone reveal that the fault rupture occurred in a layer of clay no more than a few metres thick.
According to ANU geophysicist Associate Professor Ron Hackney, the clay is very soft, slippery and exceptionally weak – a discovery that was "surprising and unusual".
He said this is the first time scientists have linked the presence of soft and slippery clay in a fault plane to ancient sediments deposited on the seafloor over millions of years.
"This work helps explain why the 2011 earthquake behaved so differently from what many of our models predicted," Associate Professor Hackney, who is also Director of the Australian and New Zealand International Scientific Drilling Consortium (ANZIC), said.
According to the scientists, learning more about the properties and nature of a fault plane can tell them how much of the fault plane might rupture during an earthquake and where the energy released during an earthquake will be concentrated along the fault.
This, in turn, provides greater insights into the processes and properties that control giant earthquakes, the resulting movement of the seafloor and tsunami generation, and the likely size and extent of any tsunami that might be triggered.
"This clay-rich ancient mud formed from microscopic particles that slowly settled on the seafloor beneath the Pacific Ocean over time – a process that took place over 130 million years – as the Pacific tectonic plate slowly moved west to ultimately be forced under Japan," Associate Professor Hackney said.
"The fault zone formed in that weak layer of clay as those sediments slowly slid under Japan, moving roughly 10 centimetres a year.
"Given that the weak clay layer is sandwiched between stronger layers of rock above and below, the clay acted like a natural 'tear line' that caused the fault to form within that layer of clay."
The 2011 Japan earthquake was the result of a steady build-up of stress over the hundreds of years since the previous earthquake, part of a never-ending cycle driven by the Pacific tectonic plate pushing under the plate on which Japan sits.
According to Associate Professor Hackney, once the built-up stress was abruptly released, the weak nature of the clay offered little resistance to the rupture generated, allowing that rupture to rapidly propagate up the fault, all the way to the seafloor.
This caused the seafloor to rise by several metres, which in turn triggered a tsunami on a scale not expected for this region.
"Amazingly, the fault didn't rupture the whole layer of clay, which extends for hundreds of kilometres along the Japan Trench – the deep ocean boundary where the Pacific and Japan tectonic plates collide with one another," he said.
"The rupture plane was just a centimetre or so thick, yet it allowed between 50 and 70 metres of movement on the fault and caused the seafloor off Japan to rise abruptly by several metres during the earthquake."
By learning more about the properties of the Tōhoku-oki earthquake fault, scientists hope to conduct better assessments of earthquake and tsunami hazards for coastal communities around the world.
"There are indications that the sediments being drawn towards and under Sumatra may also contain a weak clay layer, which suggests that the giant 2004 Boxing Day tsunami may be linked to similar fault characteristics," Associate Professor Hackney said.
"Although we can't be sure without extracting and analysing core samples directly from that fault."
The research team has also published a documentary [Not reviewed -- Ed] taking viewers behind the scenes of their epic expedition. The film follows the international team of researchers onboard Chikyu as they recover samples from beneath the Japan seafloor.
Journal Reference: https://doi.org/10.1126/science.ady0234
https://www.tomshardware.com/tech-industry/norwegian-consumer-watchdog-calls-out-enshittification
Claims Hardware Deliberately Degraded After Purchase
Alongside the report, the Forbrukerrådet and 28 co-signers — including the Electronic Frontier Foundation, Access Now, and Cory Doctorow — sent an open letter to EU policymakers on February 27, urging stronger enforcement of the Digital Markets Act and the GDPR, and pushing back against the European Commission's "Digital Omnibus" package, which the letter argued risks diluting existing consumer protections.
The collective is pushing toward the EU Digital Fairness Act, which the Commission included in its 2026 work program with a proposal expected in Q4 2026. The act is expected to target dark patterns, influencer marketing, addictive design, and unfair personalization across digital products and services.
A public consultation that closed in October 2025 drew roughly 3,000 responses in its first two weeks alone, many from gamers pushing for provisions that would prevent publishers from disabling titles consumers have already purchased — a campaign known as Stop Killing Games.