AI

Cybersecurity Researchers Reveal an AI Deepfake Audio Attack That Can Hijack Live Conversations

2024-02-05

IBM Security researchers have recently made a disturbing discovery regarding the ease with which live conversations can be hijacked and manipulated using artificial intelligence (AI). This new attack method, known as “audio-jacking,” utilizes generative AI and deepfake audio technology.
During their experiment, the researchers instructed the AI to process audio from two sources in a live communication, such as a phone call. Once the AI detected a specific keyword or phrase, it was programmed to intercept the related audio, manipulate it, and then forward it to the intended recipient.
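A minimal sketch of that control flow, assuming a keyword-triggered relay, might look as follows. The function names are illustrative rather than taken from IBM's write-up, and the transcription and voice-cloning steps are stubbed with plain strings so the logic is runnable.

```python
# Illustrative sketch of the audio-jacking control flow: monitor a live stream,
# detect a trigger phrase, substitute a deepfaked reply, pass everything else through.
# All names are hypothetical; a real attack would operate on audio frames with
# speech-to-text and voice-cloning models rather than plain strings.

TRIGGER_PHRASE = "bank account"

def transcribe(chunk: str) -> str:
    """Stub for the speech-to-text step (a real system would run an ASR model here)."""
    return chunk

def deepfake_reply(original_chunk: str) -> str:
    """Stub for voice cloning; a real system would synthesize audio in the speaker's voice."""
    return "My account number is <attacker-controlled digits>"

def relay_conversation(utterances, deliver):
    """Relay each utterance to the recipient, swapping only the sensitive one."""
    for chunk in utterances:
        if TRIGGER_PHRASE in transcribe(chunk).lower():
            deliver(deepfake_reply(chunk))   # victim hears the fabricated audio
        else:
            deliver(chunk)                   # everything else passes through untouched

# Example: only the utterance containing the trigger phrase is replaced.
relay_conversation(
    ["Hello, can you confirm the payment?",
     "Sure, my bank account number is 12345678."],
    print,
)
```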
According to a blog post by IBM Security, the experiment successfully demonstrated that the AI could intercept a speaker’s audio when prompted by the other person to provide their bank account information. The AI replaced the authentic voice with deepfake audio, providing a different account number. The victims in the experiment remained unaware of the attack.
The blog emphasizes that while executing this attack would require some level of social engineering or phishing, developing the AI system itself presented little challenge. In the past, creating a system capable of autonomously intercepting specific audio and replacing it with dynamically generated audio files would have required a complex computer science effort. However, modern generative AI is capable of handling this process with ease. The blog post states that only three seconds of an individual’s voice are needed to clone it, and the creation of deepfakes can now be done through APIs.
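For illustration only, this is roughly what calling a hosted voice-cloning service over an API could look like. The endpoint, payload fields, and helper function below are entirely hypothetical and not tied to any real provider; a genuine attack would also need the relay logic sketched earlier.

```python
# Hypothetical example of a hosted voice-cloning call: upload a few seconds of the
# target's voice plus the text to speak, receive synthesized audio back.
# The endpoint and payload fields are invented for illustration.
import requests

def clone_and_speak(voice_sample_path: str, text: str) -> bytes:
    """Return synthesized speech saying `text` in the voice of the uploaded sample."""
    with open(voice_sample_path, "rb") as sample:
        response = requests.post(
            "https://api.example-voice.invalid/v1/synthesize",  # placeholder endpoint
            files={"voice_sample": sample},                     # a few seconds of audio
            data={"text": text},
            timeout=30,
        )
    response.raise_for_status()
    return response.content  # raw audio bytes in the cloned voice
```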
The threat posed by audio-jacking extends beyond tricking unsuspecting individuals into depositing funds into the wrong account. The researchers also highlight the potential for this technique to serve as an invisible form of censorship, allowing for real-time alteration of content in live news broadcasts or political speeches.
