Feed aggregator
How to find hidden processes and ports on Linux/Unix/Windows
Unhide is a handy little forensic tool for finding processes and TCP/UDP ports hidden by rootkits, LKMs, or other hiding techniques. It works on Linux, Unix-like systems, and MS-Windows operating systems.
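Typical usage looks like the sketch below. The scan modes are the documented unhide(8) techniques; the final guard just keeps the snippet safe to run when the tool is absent:

```shell
# A sketch of typical unhide usage (requires root; on Debian/Ubuntu
# install with: sudo apt install unhide):
#   sudo unhide proc      # compare /proc contents against ps output
#   sudo unhide sys       # compare ps output against system-call scans
#   sudo unhide brute     # brute-force every possible PID
#   sudo unhide-tcp       # find TCP/UDP ports hidden from netstat/ss
if command -v unhide >/dev/null 2>&1; then
  echo 'unhide found; try: sudo unhide quick'
else
  echo 'unhide is not installed (sudo apt install unhide)'
fi
```

The `quick` mode combines the proc and sys techniques and is a reasonable first pass before the much slower `brute` scan.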
The post How to find hidden processes and ports on Linux/Unix/Windows appeared first on nixCraft.
2024-05-07T05:05:51Z
Vivek Gite
How to add bash auto completion in Debian Linux
Bash is a command language interpreter compatible with sh. It can execute commands read from a file or keyboard. On Debian Linux, bash-completion is a set of shell functions that uses Bash's programmable completion feature. This page provides instructions on installing and enabling Bash auto-completion on Debian Linux versions 10, 11, and 12, and shows how to extend it with custom bash completion code to increase your productivity.
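Installing and wiring it up takes two steps. This sketch appends the standard Debian guard block to a demo file rather than your real ~/.bashrc, so it is safe to run as-is:

```shell
# Step 1 (on a real system): sudo apt install bash-completion
# Step 2: make sure your shell startup file sources the completion
# functions. Debian's stock ~/.bashrc already carries this guard;
# here it is appended to a demo file instead of ~/.bashrc for safety.
DEMO_RC="${HOME:-/tmp}/.bashrc.demo"
cat >> "$DEMO_RC" <<'EOF'
# Enable programmable completion features if not running in POSIX mode
if ! shopt -oq posix; then
  if [ -f /usr/share/bash-completion/bash_completion ]; then
    . /usr/share/bash-completion/bash_completion
  elif [ -f /etc/bash_completion ]; then
    . /etc/bash_completion
  fi
fi
EOF
echo "snippet appended to $DEMO_RC"
```

After sourcing the file (or opening a new shell), pressing Tab completes subcommands and options for tools like apt, git, and systemctl.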
The post How to add bash auto completion in Debian Linux appeared first on nixCraft.
2024-05-06T15:51:25Z
Vivek Gite
How to add cron job entry for acme.sh
Recently, I had a learning experience with cron jobs and acme.sh. acme.sh is an excellent tool that simplifies the management of Let's Encrypt TLS (SSL) certificates. It makes obtaining and renewing these essential security certificates for your web server easier.
I recently moved my server from Linode to AWS, which was a new environment for me. Initially, everything appeared to be working correctly, and I assumed all was running smoothly. However, I forgot to migrate the cron job that acme.sh uses to renew the certificate automatically.
This oversight caused my Let's Encrypt certificates to expire, resulting in security warnings and potential disruptions for visitors to my website. Oops!
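For reference, acme.sh can reinstall its own renewal job with `acme.sh --install-cronjob`. The entry it creates looks like the sketch below; note that the minute field is randomized per install and /root/.acme.sh is the default home path, so adjust to your setup:

```shell
# A sketch of the crontab entry acme.sh installs for itself.
# Recreate it with: acme.sh --install-cronjob
# Verify afterwards with: crontab -l
CRON_LINE='10 0 * * * "/root/.acme.sh"/acme.sh --cron --home "/root/.acme.sh" > /dev/null'
echo "$CRON_LINE"
```

The `--cron` flag makes acme.sh check all managed certificates and renew only those close to expiry, so running it daily is cheap.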
The post How to add cron job entry for acme.sh appeared first on nixCraft.
2024-05-03T06:43:12Z
Vivek Gite
How to Upgrade Ubuntu 22.04 to 24.04 LTS: A Complete Guide
[nixCraft Patreon supporters content] Below is a sneak peek of this content! Ubuntu 24.04 LTS (Noble Numbat) was launched on April 25th, 2024. This new version will be supported for five years until June 2029. The armhf architecture now provides support for the Year 2038 problem. The upgrades include significant updates to core packages like Linux kernel, systemd, Netplan, […]
The post How to Upgrade Ubuntu 22.04 to 24.04 LTS: A Complete Guide appeared first on Opensource Flare✨.
2024-04-26T18:25:08Z
Vivek Gite
How to Upgrade Ubuntu 22.04 to 24.04 LTS: A Complete Guide
Ubuntu 24.04 LTS (Noble Numbat) was launched on April 25th, 2024. This new version will be supported for five years, until June 2029. The armhf architecture now provides support for the Year 2038 problem. The upgrades include significant updates to core packages like the Linux kernel, systemd, and Netplan, toolchain upgrades for better development support, enhanced security measures, and performance optimizations. It also has an updated GNOME desktop environment and other default applications. Let us see how to upgrade Ubuntu 22.04 LTS to Ubuntu 24.04 LTS using the CLI over an SSH-based session.
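In outline, the CLI upgrade path is short. This is a sketch, not the full guide; running it inside tmux or screen protects against a dropped SSH connection killing the upgrade:

```shell
# Sketch of the usual LTS-to-LTS upgrade path over ssh:
#   sudo apt update && sudo apt full-upgrade -y   # bring 22.04 fully current
#   sudo reboot                                   # boot the newest 22.04 kernel
#   sudo do-release-upgrade                       # start the 24.04 upgrade
# do-release-upgrade only offers a new LTS when Prompt=lts is configured:
CONF=/etc/update-manager/release-upgrades
grep -s '^Prompt=' "$CONF" || echo 'Prompt=lts'
```

Note that `do-release-upgrade` typically does not offer the new LTS until its first point release (24.04.1) is out; passing `-d` forces it earlier at your own risk.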
The post How to Upgrade Ubuntu 22.04 to 24.04 LTS: A Complete Guide appeared first on nixCraft.
2024-04-26T08:33:21Z
Vivek Gite
How to configure AWS SES with Postfix MTA on Debian Linux
AWS SES (Amazon Simple Email Service) is a cloud-based email-sending service that is both reliable and cost-effective. This service is offered by Amazon Web Services. Postfix is a popular email server for Debian and Unix-like systems. It is an open-source Mail Transfer Agent (MTA) responsible for routing and delivering emails. Debian Linux is a widely used Linux distribution known for its stability and user-friendliness for server usage. Let us see how to integrate AWS SES with the Postfix MTA on Debian Linux version 11/12.
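A minimal Postfix relay configuration for SES usually boils down to a few main.cf settings. The sketch below assumes the us-east-1 SMTP endpoint; substitute your region's endpoint and your own SES SMTP credentials:

```
# /etc/postfix/main.cf additions (a sketch; swap in your region's SES
# SMTP endpoint). Put your SMTP credentials in /etc/postfix/sasl_passwd
# in the form:
#   [email-smtp.us-east-1.amazonaws.com]:587 SMTP_USERNAME:SMTP_PASSWORD
# then run: postmap hash:/etc/postfix/sasl_passwd && systemctl reload postfix
relayhost = [email-smtp.us-east-1.amazonaws.com]:587
smtp_sasl_auth_enable = yes
smtp_sasl_security_options = noanonymous
smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
smtp_tls_security_level = encrypt
smtp_tls_CAfile = /etc/ssl/certs/ca-certificates.crt
```

The square brackets around the endpoint tell Postfix to skip MX lookups and connect to that host directly, and `smtp_tls_security_level = encrypt` refuses to send mail over an unencrypted channel.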
The post How to configure AWS SES with Postfix MTA on Debian Linux appeared first on nixCraft.
2024-04-19T07:04:06Z
Vivek Gite
The repository ‘http://deb.debian.org/debian buster-backports Release’ no longer has a Release file.
When you run the sudo apt update command, you may see the following error on Debian Linux:
Err:5 http://deb.debian.org/debian buster-backports Release
404 Not Found [IP: 146.75.34.132 80]
Reading package lists... Done
E: The repository 'http://deb.debian.org/debian buster-backports Release' no longer has a Release file.
N: Updating from such a repository can't be done securely, and is therefore disabled by default.
N: See apt-secure(8) manpage for repository creation and user configuration details.
Here is how to fix this issue.
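One common fix is to repoint buster-backports at archive.debian.org, where Debian hosts repositories for end-of-life releases (alternatively, simply delete or comment out the backports line). Here is a sketch, with the sed expression demonstrated on a sample sources.list line:

```shell
# Sketch: rewrite the backports host in /etc/apt/sources.list.
# On a real system, back up first and run with sudo:
#   sudo cp /etc/apt/sources.list /etc/apt/sources.list.bak
#   sudo sed -i 's|deb.debian.org/debian buster-backports|archive.debian.org/debian buster-backports|' /etc/apt/sources.list
#   sudo apt update
# Demonstration of the same sed expression on a sample line:
echo 'deb http://deb.debian.org/debian buster-backports main' |
  sed 's|deb.debian.org/debian buster-backports|archive.debian.org/debian buster-backports|'
```

Because buster is past end-of-life, archived repos no longer receive updates; the long-term fix is upgrading to a supported Debian release.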
The post The repository ‘http://deb.debian.org/debian buster-backports Release’ no longer has a Release file. appeared first on nixCraft.
2024-04-14T20:42:01Z
Vivek Gite
How do I find out my timezone in Linux?
You can find the timezone in Linux from the command line. On modern distros with systemd, the easiest way is to run the "timedatectl" command and look at the "Time zone" line. Other commands are available as well, and you can temporarily switch to a different timezone for date calculations.
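A few related commands, as a sketch; the last line actually demonstrates switching zones via the TZ environment variable:

```shell
# Common ways to read the timezone (timedatectl needs systemd):
#   timedatectl                      # look at the "Time zone:" line
#   timedatectl show -p Timezone     # script-friendly single value
#   cat /etc/timezone                # Debian/Ubuntu
#   readlink -f /etc/localtime       # symlink into /usr/share/zoneinfo
# Temporarily switch zones for one command with TZ. Normally you would
# use a zoneinfo name such as TZ='Asia/Kolkata'; the POSIX form below
# works even without tzdata installed:
TZ='UTC0' date +%Z    # prints UTC
```

Setting TZ affects only that single command, which is handy for answering "what time is it in another zone?" without touching the system clock.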
The post How do I find out my timezone in Linux? appeared first on nixCraft.
2024-04-06T01:06:44Z
Vivek Gite
Sound Blaster Re:Imagine: Modular Linux Audio Hub for Gamers and Creators - WebProNews
GitHub Announces 'Agent HQ', Letting Copilot Subscribers Run and Manage Coding Agents from Multiple Vendors
"AI isn't just a tool anymore; it's an integral part of the development experience," argues GitHub's blog. So "Agents shouldn't be bolted on. They should work the way you already work..."
So this week GitHub announced "Agent HQ," which CNBC describes as a "mission control" interface "that will allow software developers to manage coding agents from multiple vendors on a single platform."
Developers have a range of new capabilities at their fingertips because of these agents, but it can require a lot of effort to keep track of them all individually, said GitHub COO Kyle Daigle. Developers will now be able to manage agents from GitHub, OpenAI, Google, Anthropic, xAI and Cognition in one place with Agent HQ. "We want to bring a little bit of order to the chaos of innovation," Daigle told CNBC in an interview. "With so many different agents, there's so many different ways of kicking off these asynchronous tasks, and so our big opportunity here is to bring this all together." Agent HQ users will be able to access a command center where they can assign, steer and monitor the work of multiple agents...
The third-party agents will begin rolling out to GitHub Copilot subscribers in the coming months, but Copilot Pro+ users will be able to access OpenAI Codex in VS Code Insiders this week, the company said.
"We're into this wave two era," GitHub's COO Mario Rodriguez told VentureBeat, an era that's "going to be multimodal, it's going to be agentic and it's going to have these new experiences that will feel AI native...."
Or, as VentureBeat sees it, GitHub "is positioning itself as the essential orchestration layer beneath them all..."
Just as the company transformed Git, pull requests and CI/CD into collaborative workflows, it's now trying to do the same with a fragmented AI coding landscape...
The technical architecture addresses a critical enterprise concern: Security. Unlike standalone agent implementations where users must grant broad repository access, GitHub's Agent HQ implements granular controls at the platform level... Agents operating through Agent HQ can only commit to designated branches. They run within sandboxed GitHub Actions environments with firewall protections. They operate under strict identity controls. [GitHub chief product officer] Rodriguez explained that even if an agent goes rogue, the firewall prevents it from accessing external networks or exfiltrating data unless those protections are explicitly disabled.
Beyond managing third-party agents, GitHub is introducing two technical capabilities that set Agent HQ apart from alternative approaches like Cursor's standalone editor or Anthropic's Claude integration. Custom agents via AGENTS.md files: Enterprises can now create source-controlled configuration files that define specific rules, tools and guardrails for how Copilot behaves. For example, a company could specify "prefer this logger" or "use table-driven tests for all handlers." This permanently encodes organizational standards without requiring developers to re-prompt every time... Native Model Context Protocol (MCP) support: VS Code now includes a GitHub MCP Registry. Developers can discover, install and enable MCP servers with a single click. They can then create custom agents that combine these tools with specific system prompts. This positions GitHub as the integration point between the emerging MCP ecosystem and actual developer workflows. MCP, introduced by Anthropic but rapidly gaining industry support, is becoming a de facto standard for agent-to-tool communication. By supporting the full specification, GitHub can orchestrate agents that need access to external services without each agent implementing its own integration logic.
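The AGENTS.md idea can be pictured with a short example. This is a hypothetical sketch: AGENTS.md files are freeform Markdown instructions, the section names and the `corplog` logger below are invented for illustration, and GitHub's announcement does not prescribe an exact schema:

```markdown
# AGENTS.md (hypothetical example)

## Logging
- Prefer the internal `corplog` wrapper; never call `console.log` directly.

## Testing
- Use table-driven tests for all HTTP handlers.
- Every bug fix must land with a regression test.

## Guardrails
- Do not modify files under `migrations/` without an approved plan.
```

Because the file is committed to the repository, these standards travel with the code and apply to every agent session without being re-prompted.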
GitHub is also shipping new capabilities within VS Code itself. Plan Mode allows developers to collaborate with Copilot on building step-by-step project approaches. The AI asks clarifying questions before any code is written. Once approved, the plan can be executed either locally in VS Code or by cloud-based agents. The feature addresses a common failure mode in AI coding: Beginning implementation before requirements are fully understood. By forcing an explicit planning phase, GitHub aims to reduce wasted effort and improve output quality.
More significantly, GitHub's code review feature is becoming agentic. The new implementation will use GitHub's CodeQL engine, which previously focused largely on security vulnerabilities, to identify bugs and maintainability issues. The code review agent will automatically scan agent-generated pull requests before human review. This creates a two-stage quality gate.
"Don't let this little bit of news float past you like all those self-satisfied marketing pitches we semi-hear and ignore," writes ZDNet:
If it works and remains reliable, this is actually a very big deal... Tech companies, especially the giant ones, often like to talk "open" but then do their level best to engineer lock-in to their solution and their solution alone. Sure, most of them offer some sort of export tool, but the barrier to moving from one tool to another is often huge... [T]he idea that you can continue to use your favorite agent or agents in GitHub, fully integrated into the GitHub tool path, is powerful. It means there's a chance developers might not have to suffer the walled garden effect that so many companies have strived for to lock in their customers.
Read more of this story at Slashdot.
Linux breaches 3% of Steam's userbase for the first time, and no, it's not all SteamOS's doing - XDA
Is OpenAI Becoming 'Too Big to Fail'?
OpenAI "hasn't yet turned a profit," notes Wall Street Journal business columnist Tim Higgins. "Its annual revenue is 2% of Amazon.com's sales.
"Its future is uncertain beyond the hope of ushering in a godlike artificial intelligence that might help cure cancer and transform work and life as we know it. Still, it is brimming with hope and excitement.
"But what if OpenAI fails?"
There's real concern that through many complicated and murky tech deals aimed at bolstering OpenAI's finances, the startup has become too big to fail. Or, put another way, if the hype and hope around Chief Executive Sam Altman's vision of the AI future fails to materialize, it could create systemic risk to the part of the U.S. economy likely keeping us out of recession.
That's rarefied air, especially for a startup. Few worried about what would happen if Pets.com failed in the dot-com boom. We saw in 2008-09 with the bank rescues and the Chrysler and General Motors bailouts what happens in the U.S. when certain companies become too big to fail...
[A]fter a lengthy effort to reorganize itself, OpenAI announced moves that will allow it to have a simpler corporate structure. This will help it to raise money from private investors and, presumably, become a publicly traded company one day. Already, some are talking about how OpenAI might be the first trillion-dollar initial public offering... Nobody is saying OpenAI is dabbling in anything like liar loans or subprime mortgages. But the startup is engaging in complex deals with the key tech-industry pillars, the sorts of companies making the guts of the AI computing revolution, such as chips and Ethernet cables. Those companies, including Nvidia and Oracle, are partnering with OpenAI, which in turn is committing to make big purchases in coming years as part of its growth ambitions.
Supporters would argue it is just savvy dealmaking. A company like Nvidia, for example, is putting money into a market-making startup while OpenAI is using the lofty value of its private equity to acquire physical assets... They're rooting for OpenAI as a once-in-a-generation chance to unseat the winners of the last tech cycles. After all, for some, OpenAI is the next Apple, Facebook, Google and Tesla wrapped up in one. It is akin to a company with limitless potential to disrupt the smartphone market, create its own social-media network, replace the search engine, usher in a robot future and reshape nearly every business and industry.... To others, however, OpenAI is something akin to tulip mania, the harbinger of the Great Depression, or the next dot-com bubble. Or worse, they see a jobs killer and a mad scientist intent on making Frankenstein.
But that's counting on OpenAI's success.
Sound Blaster Crowdfunds Linux-Powered Audio Hub 'Re:Imagine' For Creators and Gamers
Slashdot reader BrianFagioli summarizes some news from Nerds.xyz: Creative Technology has launched Sound Blaster Re:Imagine, a modular, Linux-powered audio hub that reimagines the classic PC sound card for the modern age. The device acts as both a high-end digital-to-analog converter (DAC) and a customizable control deck that connects PCs, consoles, phones, and tablets in one setup.
Users can instantly switch inputs and outputs, while developers get full hardware access through an SDK for creating their own apps. It even supports AI-driven features like an on-device DJ, a revived "Dr. Sbaitso" speech synthesizer, and a built-in DOS emulator for retro gaming.
The Kickstarter campaign has already raised more than $150,000, far surpassing its initial goal of $15,000 with over 50 days remaining. Each unit ships with a modular "Horizon" base and swappable knobs, sliders, and buttons, while a larger "Vertex" version will unlock at a higher funding milestone.
Running an unspecified Linux build, Re:Imagine positions itself as both a nostalgic nod to Sound Blaster's roots and a new open platform for creators, gamers, and tinkerers.
GoFundMe Created 1.4 Million Donation Pages for Nonprofits Without Their Consent
San Francisco's local newscast ABC7 runs a consumer advocacy segment called "7 on Your Side". They received a disturbing call for help from Dave Dornlas, treasurer of a nonprofit supporting a local library:
GoFundMe has taken it upon itself to create "nonprofit pages" for 1.4 million 501(c)(3) organizations using public IRS data along with information from trusted partners like the PayPal Giving Fund. "The fact that they would just on their own build pages for nonprofits that they've never spoken to is a problem," [Dornlas] said. "I'm a believer in opt-in, not opt-out...." Dornlas says he struggled to find anyone to contact from GoFundMe about this...
Dave's other frustration is tied to the company's optional tipping feature on the platform. "GoFundMe also solicits a tip of 14.5%. In other words, 'We're doing this and we're great people. Give us 14.5% to do this' — which doesn't have to happen," Dornlas said. "That's what bothers me." When 7 On Your Side checked, the optional tip was actually set for 16.5%. The consumer is required to move the bar to adjust accordingly... The tip would be in addition to the 2.2% transaction fee GoFundMe charges nonprofits, plus $0.30 per donation. That fee goes up to 2.9% for individual fundraisers.
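Put in concrete terms, a sketch using the fee figures quoted above shows what a $100 donation works out to:

```shell
# What a $100 donation yields under the quoted nonprofit fees
# (2.2% + $0.30 per donation; the optional tip is charged to the
# donor on top, so it does not reduce the nonprofit's share):
awk 'BEGIN {
  d = 100.00
  net   = d - d*0.022 - 0.30   # amount the nonprofit receives
  total = d * 1.165            # donor pays this with the 16.5% default tip
  printf "net to nonprofit: %.2f\n", net     # 97.50
  printf "donor pays with default tip: %.2f\n", total   # 116.50
}'
```

So the nonprofit nets $97.50 while the donor is asked to hand over $116.50, which is the gap the Center for Nonprofit Excellence's criticism points at.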
Now both GoFundMe pages of Dornlas's nonprofits have been removed from the site. Any organization can do so, by clicking "unpublish" on the platform.
But GoFundMe's move drew strong criticism from the Center for Nonprofit Excellence (a Kentucky-based membership organization with over 500 members). GoFundMe's move, they say, creates "confusion for donors and supporters who are unsure of the legitimacy of the fundraising pages. In some cases, GoFundMe included incorrect information, outdated logos, and other inaccuracies that compromise and misrepresent nonprofits' brand, mission, strategy, and message."
And GoFundMe's processing fees and tips "ultimately result in fewer resources for nonprofits than if donors contributed directly through the organization." But there's more...
GoFundMe has initiated SEO optimization as the default for the donation pages to improve their visibility when individuals search for information about nonprofits online. This could result in GoFundMe's pages ranking higher than the nonprofit's own website, pulling away potential donors and supporters...
Without adequate safeguards in place, nonprofits report serious issues, ranging from unauthorized individuals claiming donations and the inability to remove pages without first agreeing to GoFundMe's terms and conditions or sharing sensitive banking information.
The Center for Nonprofit Excellence has now joined with the National Council of Nonprofits — America's largest network of nonprofits, with over 25,000 members — to officially urge GoFundMe to immediately rectify the situation.
Thanks to long-time Slashdot reader Arrogant-Bastard for sharing the article.
Amazon's Deployment of Rivian's Electric Delivery Vans Expands to Canada
"Amazon has deployed Rivian's electric delivery vans in Canada for the first time," reports CleanTechnica, with 50 now deployed in the Vancouver area.
Amazon's director of Global Fleet and Products says there's now over 35,000 electric vans deployed globally — and that they've delivered more than 1.5 billion packages.
More from the blog Teslarati:
In December 2024, the companies announced they had successfully deployed 20,000 EDVs across the U.S. In the first half of this year, 10,000 additional vans were delivered, and Amazon's fleet had grown to 30,000 EDVs by mid-2025. Amazon's fleet of EDVs continues to grow rapidly and has expanded to over 100 cities in the United States... The EDV is a model that is exclusive to Amazon, but Rivian sells the RCV, or Rivian Commercial Van, openly. It detailed some of the pricing and trim options back in January when it confirmed it had secured orders from various companies, including AT&T.