Posted by EditorDavid from Slashdot
From the winging-it department: Science magazine reports that hummingbird feeders "have become a major evolutionary force," according to research published this week in Global Change Biology. (At least for the Anna's hummingbird, a common species in the western U.S.)
Over just a few generations, their beaks have dramatically changed in size and shape.... [A]s feeders proliferated, Anna's hummingbird beaks got longer and larger, which may reflect an adaptation to slurp up far more nectar than flowers can naturally provide. Developing a bigger beak to access feeders "is like having a large spoon to eat with," says senior author Alejandro Rico-Guevara, an evolutionary biologist at the University of Washington. This change was more pronounced in areas where feeders were dense. But in birds that lived in colder regions north of the species' historical range, the researchers spotted the opposite trend: Their beaks became shorter and smaller. This finding also makes sense: The researchers used an infrared camera to show for the first time that hummingbirds use their beaks to thermoregulate, by dissipating heat while they are perched. A smaller beak has less surface area — and would therefore help conserve heat...
The most surprising finding, though, was how quickly these changes took place. By the 1950s, hummingbirds were noticeably different from those of the 1930s: a time span of only about 10 generations of birds, Alexandre says.
Carleton University animal behaviorist Roslyn Dakin (who wasn't involved with the study) says the new paper beautifully shows "evolution in action" — and adds nuance to our conception of humans as an evolutionary force. "I think we're going to find more and more examples of contemporary and subtle changes, that we're shaping, indirectly, in many more species."
Thanks to long-time Slashdot reader sciencehabit for sharing the article.
Posted by EditorDavid from Slashdot
From the following-the-instructions department: SiFive was one of the first companies to produce a RISC-V chip. This week they announced a new collaboration with Red Hat "to bring Red Hat Enterprise Linux support to the rapidly growing RISC-V community" and "prepare Red Hat's product portfolio for future intersection with RISC-V server hardware from a diverse set of RISC-V suppliers."
Red Hat Enterprise Linux 10 is available in developer preview on the SiFive HiFive Premier P550 platform, which they call "a proven, high performance RISC-V CPU development platform."
The SiFive HiFive Premier P550 provides a proven, high performance RISC-V CPU development platform. Adding support for Red Hat Enterprise Linux 10, the latest version of the world's leading enterprise Linux platform, enables developers to create, optimize, and release new applications for the next generation of enterprise servers and cloud infrastructure on the RISC-V architecture...
SiFive's high performance RISC-V technology is already being used by large organizations to meet compute-intensive AI and machine learning workloads in the datacenter... "With the growing demand for RISC-V, we are pleased to collaborate with SiFive to support Red Hat Enterprise Linux 10 deployments on SiFive HiFive Premier P550," said Ronald Pacheco, senior director of RHEL product and ecosystem strategy, "to further empower developers with the power of the world's leading enterprise Linux platform wherever and however they choose to deploy...."
Dave Altavilla, principal analyst at HotTech Vision And Analysis, said "Native Red Hat Enterprise Linux support on SiFive's HiFive Premier P550 board offers developers a substantial enterprise-grade toolchain for RISC-V.
"This is a pivotal step forward in enabling a full-stack ecosystem around open RISC-V hardware."
Posted by BeauHD from Slashdot
From the prompt-theory department: Google's new AI video tool, Veo 3, is being used to create hyperrealistic videos that are now flooding the internet, terrifying viewers "with a sense that real and fake have become hopelessly blurred," reports Axios. From the report: Unlike OpenAI's video generator Sora, released more widely last December, Google DeepMind's Veo 3 can include dialogue, soundtracks and sound effects. The model excels at following complex prompts and translating detailed descriptions into realistic videos. The AI engine abides by real-world physics, offers accurate lip-syncing, rarely breaks continuity and generates people with lifelike human features, including five fingers per hand.
According to examples shared by Google and from users online, the telltale signs of synthetic content are mostly absent.
In one viral example posted on X, filmmaker and molecular biologist Hashem Al-Ghaili shows a series of short films of AI-generated actors railing against their AI creators and prompts. Special effects technology, video-editing apps and camera tech advances have been changing Hollywood for many decades, but artificially generated films pose a novel challenge to human creators. In a promo video for Flow, Google's new video tool that includes Veo 3, filmmakers say the AI engine gives them a new sense of freedom with a hint of eerie autonomy. "It feels like it's almost building upon itself," filmmaker Dave Clark says.
Posted by BeauHD from Slashdot
From the crystal-ball department: An anonymous reader quotes a report from TechCrunch: One of Microsoft's latest AI models can accurately predict air quality, hurricanes, typhoons, and other weather-related phenomena, the company claims. In a paper published in the journal Nature and an accompanying blog post this week, Microsoft detailed Aurora, which the tech giant says can forecast atmospheric events with greater precision and speed than traditional meteorological approaches. Aurora, which has been trained on more than a million hours of data from satellites, radar and weather stations, simulations, and forecasts, can be fine-tuned with additional data to make predictions for particular weather events.
AI weather models are nothing new. Google DeepMind has released a handful over the past several years, including WeatherNext, which the lab claims beats some of the world's best forecasting systems. Microsoft is positioning Aurora as one of the field's top performers -- and a potential boon for labs studying weather science. In experiments, Aurora predicted Typhoon Doksuri's landfall in the Philippines four days in advance of the actual event, beating some expert predictions, Microsoft says. The model also bested the National Hurricane Center in forecasting five-day tropical cyclone tracks for the 2022-2023 season, and successfully predicted the 2022 Iraq sandstorm.
While Aurora required substantial computing infrastructure to train, Microsoft says the model is highly efficient to run. It generates forecasts in seconds compared to the hours traditional systems take using supercomputer hardware. Microsoft, which has made the source code and model weights publicly available, says that it's incorporating Aurora's AI modeling into its MSN Weather app via a specialized version of the model that produces hourly forecasts, including for clouds.
Posted by BeauHD from Slashdot
From the solar-boom department: In early 2025, U.S. solar power production jumped 44% compared to the previous year, driven by end-of-year construction to capture tax incentives and long-term cost advantages. "The bad news is that, in contrast to China, solar's growth hasn't been enough to offset rising demand," notes Ars Technica. "Instead, the US also saw significant growth in coal use, which rose by 23 percent compared to the year prior, after years of steady decline." From the report: Short-term fluctuations in demand are normal, generally driven by weather-induced demand for heating or cooling. Despite those changes, demand for electricity in the US has been largely flat for over a decade, largely thanks to gains in efficiency. But 2024 saw demand go up by nearly 3 percent, and the first quarter of 2025 saw another rise, this time of nearly 5 percent. It's a bit too early to say that we're seeing a shift to a period of rising demand, but one has been predicted for some time due to rising data center use and the increased electrification of transportation and appliances.
Under those circumstances, the rest of the difference will be made up by fossil fuels. Running counter to recent trends, the use of natural gas dropped during the first three months of 2025. This means that the use of coal rose nearly as quickly as demand, up by 23 percent compared to the same time period in 2024. Despite the rise in coal use, the fraction of carbon-free electricity held steady year over year, with wind/solar/hydro/nuclear accounting for 43 percent of all power put on the US grid. That occurred despite small drops in nuclear and hydro production.
Posted by BeauHD from Slashdot
From the bigly-changes department: Longtime Slashdot reader sinij shares a press release from the White House, outlining a series of executive orders that overhaul the Nuclear Regulatory Commission and speed up deployment of new nuclear power reactors in the U.S. From a report: The NRC is a 50-year-old, independent agency that regulates the nation's fleet of nuclear reactors. Trump's orders call for a "total and complete reform" of the agency, a senior White House official told reporters in a briefing. Under the new rules, the commission will be forced to decide on nuclear reactor licenses within 18 months. Trump said Friday the orders focus on small, advanced reactors that are viewed by many in the industry as the future. But the president also said his administration supports building large plants. "We're also talking about the big plants -- the very, very big, the biggest," Trump said. "We're going to be doing them also."
When asked whether NRC reform will result in staff reductions, the White House official said "there will be turnover and changes in roles." "Total reduction in staff is undetermined at this point, but the executive orders do call for a substantial reorganization" of the agency, the official said. The orders, however, will not remove or replace any of the five commissioners who lead the body, according to the White House. Any reduction in staff at the NRC would come at a time when the commission faces a heavy workload. The agency is currently reviewing whether two mothballed nuclear plants, Palisades in Michigan and Three Mile Island in Pennsylvania, should restart operations, a historic and unprecedented process. [...]
Java Turns 30 (2025-05-23)
Posted by BeauHD from Slashdot
From the enterprise-never-looked-back department: Richard Speed writes via The Register: It was 30 years ago when the first public release of the Java programming language introduced the world to Write Once, Run Anywhere -- and showed devs something cuddlier than C and C++. Originally called "Oak," Java was designed in the early 1990s by James Gosling at Sun Microsystems. Initially aimed at digital devices, its focus soon shifted to another platform that was pretty new at the time -- the World Wide Web.
The language, which has some similarities to C and C++, usually compiles to a bytecode that can, in theory, run on any Java Virtual Machine (JVM). The intention was to allow programmers to Write Once Run Anywhere (WORA) although subtle differences in JVM implementations meant that dream didn't always play out in reality. This reporter once worked with a witty colleague who described the system as Write Once Test Everywhere, as yet another unexpected wrinkle in a JVM caused their application to behave unpredictably. However, the language soon became wildly popular, rapidly becoming the backbone of many enterprises. [...]
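The write-once idea can be illustrated with a minimal sketch (a hypothetical example, not from the article): the class below compiles with `javac` into a `.class` file of platform-neutral bytecode, which any conforming JVM can then run unchanged.

```java
// HelloWorld.java
// Compile once:  javac HelloWorld.java  -> produces HelloWorld.class (bytecode)
// Run anywhere:  java HelloWorld        -> any JVM executes the same .class file
public class HelloWorld {
    public static void main(String[] args) {
        // The JVM, not the host OS, interprets the bytecode, so the same
        // compiled artifact runs on Linux, macOS, or Windows.
        System.out.println("Hello from the JVM");
    }
}
```

The "Write Once Test Everywhere" quip in the article refers to behavior that leaked through this abstraction in practice, such as differing default locales, file-path separators, and timing across JVM implementations.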
However, the platform's ubiquity has meant that alternatives exist to Oracle Java, and the language's popularity is undiminished by so-called "predatory licensing tactics." Over 30 years, Java has moved from an upstart new language to something enterprises have come to depend on. Yes, it may not have the shiny baubles demanded by the AI applications of today, but it continues to be the foundation for much of today's modern software development. A thriving ecosystem and a vast community of enthusiasts mean that Java remains more than relevant as it heads into its fourth decade.
Posted by BeauHD from Slashdot
From the worse-for-everyone department: Google's new AI Mode for Search, which is rolling out to everyone in the U.S., has sparked outrage among publishers, who call it "the definition of theft" for using content without fair compensation and without offering a true opt-out option. Internal documents revealed by Bloomberg earlier this week suggest that Google considered giving publishers more control over how their content is used in AI-generated results but ultimately decided against it, prioritizing product functionality over publisher protections.
News/Media Alliance slammed Google for "further depriving publishers of original content of both traffic and revenue." Their full statement reads: "Links were the last redeeming quality of search that gave publishers traffic and revenue. Now Google just takes content by force and uses it with no return, the definition of theft. The DOJ remedies must address this to prevent continued domination of the internet by one company." 9to5Google's take: It's not hard to see why Google went the route that it did here. Giving publishers the ability to opt out of AI products while still benefiting from Search would ultimately make Google's flashy new tools useless if enough sites made the switch. It was very much a move in the interest of building a better product.
Does that change anything regarding how Google's AI products in Search cause potential harm to the publishing industry? Nope.
Google's tools continue to serve the company and its users (mostly) well, but as they continue to bleed publishers dry, those publishers are on the verge of vanishing or, arguably worse, turning to cheap and poorly produced content just to get enough views to survive. This is a problem Google needs to address, as it's making the internet as a whole worse for everyone.