Feed aggregator

Amazon Is About To Be Flooded With AI-Generated Video Ads

Slashdot.org - Wed, 06/11/2025 - 16:20
Amazon has launched its AI-powered Video Generator tool in the U.S., allowing sellers to quickly create photorealistic, motion-enhanced video ads, often with a single click. "We'll likely see Amazon retailers utilizing AI-generated video ads in the wild now that the tool is generally available in the U.S. and costs nothing to use -- unless the ads are so convincing that we don't notice anything at all," says The Verge. From the report: New capabilities include motion improvements to show items in action, which Amazon says is best for showcasing products like toys, tools, and worn accessories. For example, Video Generator can now create clips that show someone wearing a watch on their wrist and checking the time, instead of simply displaying the watch on a table. The tool generates six different videos to choose from and allows brands to add their logos to the finished results. Video Generator can now also make ads with multiple connected scenes that include humans, pets, text overlays, and background music. The editing timeline shown in Amazon's announcement video suggests the ads max out at 21 seconds. The resulting ads edge closer to the traditional commercials we're used to seeing while watching TV or online content, compared to the raw clips generated by video AI tools like OpenAI's Sora or Adobe Firefly. A new video summarization feature can create condensed video ads from existing footage, such as demos, tutorials, and social media content. Amazon says Video Generator will automatically identify and extract key clips to generate new videos formatted for ad campaigns. A one-click image-to-video feature is also available that creates shorter GIF-style clips showing products in action.


Hong Kong Bans Video Game Using National Security Laws

Slashdot.org - Wed, 06/11/2025 - 15:40
Hong Kong authorities have invoked national security laws for the first time to ban the Taiwan-made video game Reversed Front: Bonfire, accusing it of promoting "secessionist agendas, such as 'Taiwan independence' and 'Hong Kong independence.'" Engadget reports: Reversed Front: Bonfire was developed by a group known as ESC Taiwan, who are outspoken critics of China's Communist Party. The game disappeared from the Apple App Store in Hong Kong less than 24 hours after authorities issued the warning. Google had already removed the game from the Play Store back in May because players were using hate speech in their usernames. ESC Taiwan told The New York Times that the game's removal shows that apps like theirs are subject to censorship in mainland China. The group also thanked authorities on Facebook for the free publicity, as the game experienced a surge in Google searches. The game uses anime-style illustrations and allows players to fight against China's Communist Party by taking on the role of "propagandists, patrons, spies or guerrillas" from Hong Kong, Taiwan, Tibet, Mongolia and Xinjiang, which is home to ethnic minorities like the Uyghurs. That said, they can also choose to play as government soldiers. In its warning, Hong Kong Police said that anybody who shares or recommends the game on the internet may be committing several offenses, including "incitement to secession," "incitement to subversion" and "offenses in connection with seditious intention." Anybody who has downloaded the game will be considered in "possession of a publication that has a seditious intention," and anybody who provides financial assistance to it will be violating national security laws as well. "Those who have downloaded the application should uninstall it immediately and must not attempt to defy the law," the authorities wrote.


Scientists Built a Badminton-Playing Robot With AI-Powered Skills

Slashdot.org - Wed, 06/11/2025 - 15:00
An anonymous reader quotes a report from Ars Technica: The robot built by [Yuntao Ma and his team at ETH Zurich] was called ANYmal and resembled a miniature giraffe that plays badminton by holding a racket in its teeth. It was a quadruped platform developed by ANYbotics, an ETH Zurich spinoff company that mainly builds robots for the oil and gas industries. "It was an industry-grade robot," Ma said. The robot had elastic actuators in its legs, weighed roughly 50 kilograms, and was half a meter wide and under a meter long. On top of the robot, Ma's team fitted an arm with several degrees of freedom, produced by another ETH Zurich spinoff called Duatic. This is what would hold and swing a badminton racket. Shuttlecock tracking and sensing the environment were done with a stereoscopic camera. "We've been working to integrate the hardware for five years," Ma said. Along with the hardware, his team was also working on the robot's brain. State-of-the-art robots usually use model-based control optimization, a time-consuming, sophisticated approach that relies on a mathematical model of the robot's dynamics and environment. "In recent years, though, the approach based on reinforcement learning algorithms became more popular," Ma told Ars. "Instead of building advanced models, we simulated the robot in a simulated world and let it learn to move on its own." In ANYmal's case, this simulated world was a badminton court where its digital alter ego was chasing after shuttlecocks with a racket. The training was divided into repeatable units, each of which required the robot to predict the shuttlecock's trajectory and hit it with a racket six times in a row. During this training, like a true sportsman, the robot also got to know its physical limits and learned to work around them. The idea behind training the control algorithms was to develop visuo-motor skills similar to those of human badminton players. The robot was supposed to move around the court, anticipating where the shuttlecock might go next and positioning its whole body, using all available degrees of freedom, for a swing that would produce a good return. This is why balancing perception and movement played such an important role. The training procedure included a perception model based on real camera data, which taught the robot to keep the shuttlecock in its field of view while accounting for the noise and the resulting object-tracking errors. Once the training was done, the robot learned to position itself on the court. It figured out that the best strategy after a successful return is to move back to the center and toward the backline, which is something human players do. It even came up with a trick where it stood on its hind legs to see the incoming shuttlecock better. It also learned fall avoidance and determined how much risk was reasonable to take given its limited speed. The robot did not attempt impossible plays that would create the potential for serious damage -- it was committed, but not suicidal. But when it finally played humans, it turned out that ANYmal, as a badminton player, was an amateur at best. The findings have been published in the journal Science Robotics. You can watch a video of the four-legged robot playing badminton on YouTube.
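The training setup described above (repeatable simulated units, noisy camera-like observations, and success defined as six returns in a row) can be sketched in miniature. The toy Python below is a hypothetical illustration of that episodic structure only: the 2D "court," the linear policy, and the evolution-strategies-style update are stand-ins chosen for brevity, not the actual ANYmal training code, which used reinforcement learning over a full physics simulation of a legged robot with an arm.

import numpy as np

rng = np.random.default_rng(0)

def simulate_unit(policy_weights, obs_noise=0.05, hits_required=6):
    # One training unit: the agent must intercept six consecutive shots.
    # The "court" is a 2D plane; each shot lands at a random point, which
    # the agent sees only through a noisy, camera-like estimate.
    agent_pos = np.zeros(2)  # start at the court center
    hits = 0
    for _ in range(hits_required):
        landing = rng.uniform(-1.0, 1.0, size=2)                 # true landing point
        observed = landing + rng.normal(0.0, obs_noise, size=2)  # noisy perception
        # Linear policy: next position is a weighted mix of the noisy
        # observation and the current position.
        agent_pos = policy_weights @ np.concatenate([observed, agent_pos])
        if np.linalg.norm(agent_pos - landing) < 0.2:  # close enough to return the shot
            hits += 1
        else:
            break  # a miss ends the unit, mirroring the six-in-a-row objective
    return hits

def train(iterations=300, population=32, sigma=0.1, lr=0.02):
    # Evolution-strategies-style search over the policy weights: perturb the
    # weights, score each perturbation by hits per unit, and step toward the
    # better-scoring directions.
    weights = np.zeros((2, 4))
    for _ in range(iterations):
        noise = rng.normal(0.0, 1.0, size=(population, 2, 4))
        rewards = np.array([simulate_unit(weights + sigma * n) for n in noise], dtype=float)
        if rewards.std() > 0:
            advantage = (rewards - rewards.mean()) / rewards.std()
            weights += lr / (population * sigma) * np.einsum("p,pij->ij", advantage, noise)
    return weights

if __name__ == "__main__":
    trained = train()
    scores = [simulate_unit(trained) for _ in range(100)]
    print("average hits per unit after training:", np.mean(scores))

The point of the sketch is the episodic objective: a unit ends on the first miss, so the only way to earn a high reward is to track and intercept several shots in succession, which is roughly the training target the researchers describe, here reduced to a toy problem.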

