Feed aggregator

Anthropic Accuses Chinese Companies of Siphoning Data From Claude

Slashdot.org - 1 hour 51 min ago
U.S. artificial-intelligence startup Anthropic said three Chinese AI companies set up more than 24,000 fraudulent accounts to access its Claude AI model and help their own systems catch up. From a report: The three companies -- DeepSeek, Moonshot AI and MiniMax -- prompted Claude more than 16 million times, siphoning information from Anthropic's system to train and improve their own products, Anthropic said in a blog post Monday. Earlier this month, an Anthropic rival, OpenAI, sent a memo to House lawmakers accusing DeepSeek of using the same tactic, called distillation, to mimic OpenAI's products. Anthropic said distillation had legitimate uses -- companies use it to build smaller versions of their own products, for example -- but it could also be used to build competitive products "in a fraction of the time, and at a fraction of the cost." The scale of the different companies' distillation activity varied: DeepSeek engaged in 150,000 interactions with Claude, whereas Moonshot and MiniMax had more than 3.4 million and 13 million, respectively, Anthropic said.

Read more of this story at Slashdot.

Say Goodbye to the Undersea Cable That Made the Global Internet Possible

Slashdot.org - 2 hours 36 min ago
The first fiber-optic cable ever laid across an ocean -- TAT-8, a nearly 6,000-kilometer line between the United States, United Kingdom, and France that carried its first traffic on December 14, 1988 -- is now being pulled off the Atlantic seabed after more than two decades of sitting dormant, bound for recycling in South Africa. Subsea Environmental Services, one of only three companies in the world whose entire business is cable recovery and recycling, began the operation last year using its new diesel-electric vessel, the MV Maasvliet, and had already brought 1,012 kilometers of the cable to the Portuguese port of Leixoes by August. TAT-8, short for Trans-Atlantic Telephone 8, was built by AT&T, British Telecom, and France Telecom, and hit full capacity within just 18 months of going live. A fault too expensive to repair took it out of service in 2002. The recovered cable is being shipped to Mertech Marine in South Africa, where it will be broken down into steel, copper, and two types of polyethylene -- all commercially valuable, especially the high-quality copper at a time when the International Energy Agency projects global shortages within a decade.

PayPal Attracts Takeover Interest After Stock Slump

Slashdot.org - 3 hours 13 min ago
An anonymous reader shares a report: PayPal, the digital payments pioneer, is attracting takeover interest from potential buyers after a stock slide wiped out almost half of its value, according to people familiar with the matter. The San Jose, California-based company has fielded meetings with banks amid unsolicited interest from suitors, the people said. At least one large rival is looking at the whole company, while some other suitors are only interested in certain PayPal assets, the people said, asking not to be identified because the information is private. Buyer interest in PayPal is still at a preliminary stage and may not lead to a transaction, the people cautioned. Founded in the late 1990s, PayPal was an early mover in the world of digital payments. But the company now finds itself in a rut with its customers increasingly turning to alternative ways to pay for things. PayPal's shares have fallen around 46% in New York trading over the last 12 months, giving the company a market value of about $38.4 billion.

Climate Physicists Face the Ghosts in Their Machines: Clouds

Slashdot.org - 3 hours 59 min ago
Climate scientists trying to predict how much hotter the planet will get have long grappled with a surprisingly stubborn problem -- clouds, which both reflect sunlight and trap heat, account for more than half the variation between climate predictions and are the main reason warming projections for the next 50 years range from 2 to 6 degrees Celsius. Two research groups are now racing to close that gap using AI, though they disagree sharply on method. Tapio Schneider at Caltech built CLIMA, a model that uses machine learning to optimize cloud parameters within traditional physics equations; it will be unveiled at a conference in Japan in March. Chris Bretherton at the Allen Institute for AI took a different path -- his ACE2 neural network, released in 2024, learns from 50 years of atmospheric data and largely bypasses physics equations altogether.

Stressful People in Your Life Could Be Adding Months To Your Biological Age

Slashdot.org - 4 hours 39 min ago
A study published last week in PNAS found that people who regularly cause problems or make life difficult -- whom the researchers call "hasslers" -- are associated with measurably faster biological aging in those around them, at a rate of roughly 1.5% per additional hassler and about nine months of additional biological age relative to same-age peers. The research drew on DNA methylation-based epigenetic clocks and ego-centric network data from a state-representative probability sample of 2,345 adults in Indiana, aged 18 to 103. Nearly 29% of respondents reported at least one hassler in their close network. The biological toll varied by relationship type: hasslers who were family members showed the strongest and most consistent associations with accelerated aging, while spouse hasslers showed no significant effect on either epigenetic measure. The damage also went beyond aging clocks -- each additional hassler was associated with greater depression and anxiety severity, higher BMI, increased inflammation, and higher multimorbidity. When benchmarked against smoking, a major behavioral risk factor for aging, the hassler effect corresponded to roughly 13 to 17% of smoking's estimated impact on the same aging clocks.

Sam Altman Would Like To Remind You That Humans Use a Lot of Energy, Too

Slashdot.org - 5 hours 21 min ago
OpenAI CEO Sam Altman is pushing back on growing concerns about AI's environmental footprint, dismissing claims about ChatGPT's water consumption as "totally fake" and arguing that the fairer way to measure AI's energy use is to compare it against humans. In an interview with Indian Express, Altman acknowledged that evaporative cooling in data centers once made water usage a real concern but said that is no longer the case, calling internet claims of 17 gallons of water per query "completely untrue, totally insane, no connection to reality." On energy, he conceded it is "fair" to worry about total consumption given how heavily the world now relies on AI, and called for a rapid shift toward nuclear, wind and solar power. He took particular issue with comparisons that pit the cost of training a model against a single human inference, noting it "takes like 20 years of life and all of the food you eat" before a person gets smart -- and that on a per-query basis, AI has "probably already caught up on an energy efficiency basis."

Goldman Sachs, Morgan Stanley Calculate AI's Contribution To U.S. Growth May Be Basically Zero

Slashdot.org - 6 hours 1 min ago
The narrative that AI spending has been singlehandedly propping up the U.S. economy -- a claim that captivated Silicon Valley, Wall Street and Washington over the past year -- is facing serious pushback from economists [non-paywalled source] at Goldman Sachs, Morgan Stanley and JPMorgan Chase, all of whom now calculate that the AI buildup's direct contribution to growth was dramatically overstated and possibly close to zero. The debate hinges on how GDP accounts for imported components: roughly three-quarters of AI data center costs go toward computer chips and gear largely manufactured in Asia, and that spending gets subtracted from domestic output because it boosts foreign economies. Joseph Politano of the Apricitas Economics newsletter pegs AI's actual contribution at about 0.2 percentage points of the 2.2 percent U.S. growth in 2025, and even Hannah Rubinton at the St. Louis Fed -- whose own analysis attributed 39 percent of growth to AI-related business spending through the first nine months of the year -- acknowledges that figure is probably the ceiling. "It's not like AI is propping up the economy," Rubinton said.

Is AI Impacting Which Programming Language Projects Use?

Slashdot.org - 7 hours 27 min ago
"In August 2025, TypeScript surpassed both Python and JavaScript to become the most-used language on GitHub for the first time ever..." writes GitHub's senior developer advocate. They point to this as proof that "AI isn't just speeding up coding. It's reshaping which languages, frameworks, and tools developers choose in the first place."

Eighty percent of new developers on GitHub use Copilot within their first week. Those early exposures reset the baseline for what "easy" means. When AI handles boilerplate and error-prone syntax, the penalty for choosing powerful but complex languages disappears. Developers stop avoiding tools with high overhead and start picking based on utility instead. The language adoption data shows this behavioral shift:

- TypeScript grew 66% year-over-year
- JavaScript grew 24%
- Shell scripting usage in AI-generated projects jumped 206%

That last one matters. We didn't suddenly love Bash. AI absorbed the friction that made shell scripting painful. So now we use the right tool for the job without the usual cost. "When a task or process goes smoothly, your brain remembers," they point out. "Convenience captures attention. Reduced friction becomes a preference — and preferences at scale can shift ecosystems."

"AI performs better with strongly typed languages. Strongly typed languages give AI much clearer constraints..."

"Standardize before you scale. Document patterns. Publish template repositories. Make your architectural decisions explicit. AI tools will mirror whatever structures they see."

"Test AI-generated code harder, not less."

Our commitment to make AI training available to all 6 million U.S. educators (VP & General Manager, Education, Google)

GoogleBlog - 8 hours 1 min ago
Learn about Google’s partnership to bring AI skills to every classroom with free training for 6 million U.S. educators.
Categories: Technology

Rule-Breaking Black Hole Growing At 13x the Cosmic 'Speed Limit' Challenges Theories

Slashdot.org - 11 hours 27 min ago
"A surprisingly ravenous black hole from the dawn of the universe is breaking two big rules," reports Live Science. "It's not only exceeding the 'speed limit' of black hole growth but also generating extreme X-ray and radio wave emissions — two features that are not predicted to coexist..." "How is this rule-breaking behavior even possible? In a paper published Jan. 21 in The Astrophysical Journal, an international team of researchers observed ID830 in multiple wavelengths to find an answer...." As they attract gas and dust, this material accumulates in a swirling accretion disk. Gravity pulls the material from the disk into the black hole, but the infalling material generates radiation pressure that pushes outward and prevents more stuff from falling in. As a result, black holes are muzzled by a self-regulating process called the Eddington limit... Its X-ray brightness suggests that ID830 is accreting mass at about 13 times the Eddington limit, due to a sudden burst of inflowing gas that may have occurred as ID830 shredded and engulfed a celestial body that wandered too close. "For a supermassive black hole (SMBH) as massive as ID830, this would require not a normal (main-sequence) star, but a more massive giant star or a huge gas cloud," study co-author Sakiko Obuchi, an observational astronomer at Waseda University in Tokyo, told Live Science via email. Such super-Eddington phases may be incredibly brief, as "this transitional phase is expected to last for roughly 300 years," Obuchi added. ID830 also simultaneously displays radio and X-ray emissions. These two features are not expected to coexist, especially because super-Eddington accretion is thought to suppress such emissions. "This unexpected combination hints at physical mechanisms not yet fully captured by current models of extreme accretion and jet launching," the researchers said in a statement. 
So while ID830 is launching massive radio jets, its X-ray emissions appear to originate from a structure called a corona, produced as intense magnetic fields from the accretion disk create a thin but turbulent billion-degree cloud of turbocharged particles. These particles orbit the black hole at nearly the speed of light, in what NASA calls "one of the most extreme physical environments in the universe." Altogether, ID830's rule-breaking behaviors suggest that it is in a rare transitional phase of excessive consumption — and excretion. This incredible feeding burst has energized both its jets and its corona, making ID830 shine brightly across multiple wavelengths as it spews out excess radiation. Additionally, based on UV-brightness analysis, quasars like ID830 may be unexpectedly common, the researchers said. Models predict that only around 10% of quasars have spectacular radio jets, but these energetic objects could be significantly more abundant in the early universe than previously suggested. Most importantly, ID830 also shows how SMBHs can regulate galaxy growth in the early universe. As a black hole gobbles matter at the super-Eddington limit, the energy from its resultant emissions can heat and disperse matter throughout the interstellar medium — the gas between stars — to suppress star formation. As a result, ancient SMBHs like ID830 may have grown massive at the expense of their host galaxies.
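For reference, the Eddington limit discussed above is the luminosity at which outward radiation pressure on infalling gas balances the black hole's gravity. For accretion of ionized hydrogen it works out to:

```latex
L_{\mathrm{Edd}} = \frac{4 \pi G M m_p c}{\sigma_T}
\approx 1.26 \times 10^{38} \left( \frac{M}{M_\odot} \right) \ \mathrm{erg\,s^{-1}}
```

where G is the gravitational constant, M the black hole mass, m_p the proton mass, c the speed of light, and σ_T the Thomson scattering cross-section. ID830's X-ray brightness implies a luminosity roughly 13 times this value for its mass.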

Download of the day: GIMP 3.0 is FINALLY Here!

nixCraft - 13 hours 25 min ago
Wow! After years of hard work and countless commits, we have finally reached a huge milestone: GIMP 3.0 is officially released! I am excited as I write this and can't wait to share some incredible new features and improvements in this release. GIMP 2.10 was released in 2018, and the first development version of GIMP 3.0 came out in 2020. GIMP 3.0 was released on March 16, 2025. Let us explore how to download and install GIMP 3.0, as well as the new features in this version. The post Download of the day: GIMP 3.0 is FINALLY Here! appeared first on nixCraft. 2025-03-18T03:45:26Z Vivek Gite

How to list upgradeable packages on FreeBSD using pkg

nixCraft - 13 hours 25 min ago
Here is a quick way to list all upgradeable packages on FreeBSD using the pkg command. This is equivalent to the apt list --upgradable command on my Debian or Ubuntu Linux system. The post How to list upgradeable packages on FreeBSD using pkg appeared first on nixCraft. 2025-03-16T20:25:39Z Vivek Gite
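A sketch of the commands involved (to be run on FreeBSD): `pkg upgrade -n` does a dry run that lists pending upgrades without installing anything, and `pkg version -l '<'` lists packages whose installed version is older than the one in the repository. The runnable part below only demonstrates the "installed older than available" comparison pkg performs, using made-up version strings:

```shell
# On FreeBSD (assumed):
#   pkg update            # refresh the repository catalogue
#   pkg upgrade -n        # dry run: show what would be upgraded
#   pkg version -l '<'    # list packages older than the repo version
# Portable illustration of the comparison, with sample values:
installed="2.44.0"
available="2.44.2"
oldest=$(printf '%s\n%s\n' "$installed" "$available" | sort -V | head -n1)
if [ "$installed" != "$available" ] && [ "$oldest" = "$installed" ]; then
    echo "upgradeable: $installed -> $available"
fi
```

The `sort -V` version sort orders dotted version strings numerically, which is how the comparison gets "2.44.0 < 2.44.2" right where a plain string sort could fail.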

Ubuntu to Explore Rust-Based “uutils” as Potential GNU Core Utilities Replacement

nixCraft - 13 hours 25 min ago
In a move that has sparked significant discussion within the Ubuntu Linux community, Canonical, the company behind Ubuntu, has announced its intention to explore replacing the GNU Core Utilities with the Rust-based "uutils" project. Canonical plans to introduce the change in Ubuntu Linux 25.10, with the goal of making uutils the default in the Ubuntu 26.04 LTS release in 2026. Let us find out the pros and cons and what this means for you as an Ubuntu Linux user, IT pro, or developer. The post Ubuntu to Explore Rust-Based “uutils” as Potential GNU Core Utilities Replacement appeared first on nixCraft. 2025-03-16T12:17:36Z Vivek Gite

How to install KSH on FreeBSD

nixCraft - 13 hours 25 min ago
Installing KSH (KornShell) on FreeBSD can be done with either FreeBSD ports or the pkg command. The ports collection will download the KSH source code, compile it, and install it on the system. The pkg method is easier, and it will download a pre-compiled binary package. Hence, it is recommended for all users. KornShell (KSH) has a long history, and many older Unix systems and scripts rely on it. As a result, KSH remains relevant for maintaining and supporting legacy infrastructure. Large enterprises, especially those with established Unix-based systems, continue to use KSH for scripting and system administration tasks. Some industries where KSH is still commonly used include finance and telecommunications. While Bash has become the dominant shell in many Linux distributions, KSH still holds a significant presence in Unix-like environments, particularly in legacy systems. Therefore, installing KSH and practicing with it is worthwhile if you plan to work in such environments. The post How to install KSH on FreeBSD appeared first on nixCraft. 2025-03-03T23:50:59Z Vivek Gite

Linux Sed Tutorial: Learn Text Editing with Syntax & Examples

nixCraft - 13 hours 25 min ago
Sed is an acronym for "stream editor." A stream refers to a source or destination for bytes. In other words, sed can read its input from standard input (stdin), apply the specified edits to the stream, and automatically output the results to standard output (stdout). Sed syntax allows an input file to be specified on the command line. However, the syntax does not directly support specifying an output file; this can be achieved through output redirection, or by editing files in place (optionally keeping a backup of the original). Sed is one of the most powerful tools on Linux and Unix-like systems. Learning it is worthwhile, so in this tutorial, we will start with the sed command syntax and examples. The post Linux Sed Tutorial: Learn Text Editing with Syntax & Examples appeared first on nixCraft. 2025-03-03T09:47:07Z Vivek Gite
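As a quick taste of the syntax the tutorial covers, here is a minimal substitution example (the input text is made up for illustration):

```shell
# s/pattern/replacement/ substitutes the first match on each line;
# the trailing g flag substitutes every match on the line.
echo "hello world, wide world" | sed 's/world/planet/g'

# Since sed has no output-file option, use redirection to keep the result:
echo "hello world" | sed 's/world/planet/' > /tmp/greeting.txt
cat /tmp/greeting.txt

# In-place editing with a backup of the original (GNU sed syntax):
#   sed -i.bak 's/world/planet/' file.txt   # edits file.txt, keeps file.txt.bak
```

The first command prints "hello planet, wide planet"; the redirected version leaves "hello planet" in /tmp/greeting.txt.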

How to tell if FreeBSD needs a reboot using a kernel version check

nixCraft - 13 hours 25 min ago
Keeping your FreeBSD server or workstation updated is crucial for security and stability. However, after applying updates, especially kernel updates, you might wonder, "Do I need to reboot my system?" Let's simplify this process and provide a straightforward method for determining whether a reboot is necessary using the CLI, a shell script, and an Ansible playbook. The post How to tell if FreeBSD needs a reboot using a kernel version check appeared first on nixCraft. 2025-02-23T22:07:23Z Vivek Gite
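The core idea, as a sketch with sample version strings standing in for real output: on FreeBSD, `freebsd-version -k` reports the version of the kernel installed on disk, while `freebsd-version -r` reports the version of the currently running kernel. If the two differ, the machine is still running the pre-update kernel and needs a reboot.

```shell
# On a real FreeBSD box you would capture these with:
#   installed=$(freebsd-version -k)   # kernel installed on disk
#   running=$(freebsd-version -r)     # kernel currently running
# Sample values for illustration:
installed="14.2-RELEASE-p2"
running="14.2-RELEASE-p1"

if [ "$installed" != "$running" ]; then
    echo "Reboot required: running $running, installed $installed"
else
    echo "No reboot needed"
fi
```

With the sample values above, the script prints "Reboot required: running 14.2-RELEASE-p1, installed 14.2-RELEASE-p2".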

Critical Rsync Vulnerability Requires Immediate Patching on Linux and Unix systems

nixCraft - 13 hours 25 min ago
Rsync is an open source command-line tool for Linux, macOS, *BSD, and Unix-like systems that synchronizes files and directories. It is a popular tool for sending or receiving files, making backups, or setting up mirrors. It minimizes the data copied by transferring only the changed parts of files, making it faster and more bandwidth-efficient than traditional copying methods provided by tools like sftp or ftp-ssl. Rsync versions 3.3.0 and below have been found to contain six serious vulnerabilities. Attackers could exploit these to leak your data, corrupt your files, or even take over your system. One of them is a heap-based buffer overflow with a CVSS score of 9.8 that needs to be addressed on both the client and server sides of the rsync package. Apart from that, an information leak via uninitialized stack contents defeats ASLR protection, and a malicious rsync server can make clients write files outside the destination directory using symbolic links. The post Critical Rsync Vulnerability Requires Immediate Patching on Linux and Unix systems appeared first on nixCraft. 2025-01-15T18:04:24Z Vivek Gite
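To check whether an installed rsync falls in the affected range, compare its version against 3.4.0, the first fixed release. A minimal sketch (the version string below is a hard-coded sample; on a live system you would parse the first line of `rsync --version`):

```shell
# On a live system, something like:
#   version=$(rsync --version | awk 'NR==1 {print $3}')
version="3.2.7"   # sample value for illustration

# Split out major and minor components of the dotted version string.
major=${version%%.*}
rest=${version#*.}
minor=${rest%%.*}

# Versions before 3.4.0 (i.e. 3.3.0 and below) are affected.
if [ "$major" -lt 3 ] || { [ "$major" -eq 3 ] && [ "$minor" -lt 4 ]; }; then
    echo "vulnerable: rsync $version predates 3.4.0"
else
    echo "ok: rsync $version"
fi
```

With the sample value 3.2.7 this prints the "vulnerable" branch; swap in your real version (or the awk line above) to check an actual host.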

How to control SSH multiplexing with control commands

nixCraft - 13 hours 25 min ago
Multiplexing boosts your SSH connection speed by reusing existing TCP connections to a remote host. This is useful when you frequently connect to the same server over the SSH protocol for remote logins, server management, IT automation tools, or even hourly backups. However, sometimes your SSH client will not respond, or will hang in the session, when using multiplexing. Typically, this happens when your public IP changes (such as IPv4 to IPv6 switches when using DNS names), VPN issues occur, or a firewall cuts connections. Hence, knowing the SSH client control commands can save you time and boost your productivity when such gotchas occur. The post How to control SSH multiplexing with control commands appeared first on nixCraft. 2025-01-15T08:29:10Z Vivek Gite
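For context, multiplexing is enabled per host in ~/.ssh/config; a minimal sketch (the host alias, hostname, and socket path are examples, and the ~/.ssh/sockets directory must exist):

```
Host myserver
    HostName server.example.com
    # Open one master connection and reuse it for subsequent sessions
    ControlMaster auto
    # Socket file that identifies the shared connection
    ControlPath ~/.ssh/sockets/%r@%h-%p
    # Keep the master alive 10 minutes after the last session closes
    ControlPersist 10m
```

Once a master connection exists, the control commands the tutorial covers operate on it: `ssh -O check myserver` tests whether the master is alive, and `ssh -O stop myserver` or `ssh -O exit myserver` tears down a stuck master without killing your terminal.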

ZFS RAIDz Expansion Is Finally Here in Version 2.3.0

nixCraft - 13 hours 25 min ago
After years of development and testing, ZFS raidz expansion is finally here, released as part of version 2.3.0. ZFS is a popular file system for Linux and FreeBSD. RAIDz is similar to RAID 5, which you find with hardware or Linux software RAID devices. It protects your data by spreading it across multiple hard disks along with parity information. A raidz device can have single, double, or triple parity to sustain one, two, or three hard disk failures, respectively, without losing any data. Hence, the ability to expand a raidz device by adding a new HDD is a very handy feature for sysadmins running today's data-hungry apps. The post ZFS RAIDz Expansion Is Finally Here in Version 2.3.0 appeared first on nixCraft. 2025-01-14T09:19:20Z Vivek Gite

How to run Docker inside Incus containers

nixCraft - 13 hours 25 min ago
Incus and Docker both use Linux kernel features to containerize your applications. Incus is best suited when you need system-level containers that act like traditional VMs and provide a persistent developer experience. Docker containers, on the other hand, are ephemeral: unless you store them in volumes mounted from outside the container, all files created inside a Docker container are lost when the container is stopped or removed. Docker was created as a disposable app deployment system; Incus containers are not typically treated as disposable, and their data is kept when they are stopped. Because the Linux kernel supports nesting, you can run Docker inside Incus. This page explains how to run Docker inside Incus containers. The post How to run Docker inside Incus containers appeared first on nixCraft. 2024-12-18T05:44:26Z Vivek Gite
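The key prerequisite is allowing nested containers. A sketch, assuming a container named `docker-host` (the name and profile below are examples): nesting is controlled by the `security.nesting` key, which you can set directly with `incus config set docker-host security.nesting true`, or bake into a profile applied to every Docker-capable container:

```
config:
  # Allow the container to run its own containers (required for Docker)
  security.nesting: "true"
description: Profile for Incus containers that run Docker
name: docker
```

With nesting enabled, you install Docker inside the Incus container exactly as you would on a regular host.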
