Healthcare Facilities Alerted to 'Scattered Spider' Cyber Threat

The group uses a form of AI technology known as deep fakes, which are convincing yet false recreations that can trick people.

By Jeff Wardon, Jr., Assistant Editor


The Health Sector Cybersecurity Coordination Center (HC3) has issued a warning about the Scattered Spider threat actor, which has targeted organizations across many sectors, including healthcare. The group is known for using legitimate, widely available tools alongside malware, including multiple ransomware variants, and as of the second quarter of 2024 it has added RansomHub and Qilin to its arsenal.

Scattered Spider uses AI tools to mimic victims’ voices to gain initial access to targeted organizations, according to HC3’s warning. This follows the recent trend of “deep fakes” in cyberattacks, where the technology creates realistic – yet false – images, videos or audio recordings, TechTarget reports. When this content is used in cyberattacks, it can trick targets into unwittingly following a cybercriminal’s commands. Scattered Spider is expected to continue refining its techniques to further avoid detection.

With cybercrime on the rise, healthcare organizations must be aware of every potential threat to survive in the digital age. Detecting whether something is a deep fake or a phishing scam is crucial to protecting healthcare organizations from cyberattacks. TechTarget lists these hints for determining if something is a deep fake:

  • Facial and body movement: In images and videos, deep fakes can be identified by closely examining participants’ facial expressions and body language. There may be inconsistencies in a person’s likeness that AI cannot fully reproduce. 
  • Lip-sync detection: When a video is paired with altered audio of a spoken voice, the syncing between lips and words can be mismatched. Paying close attention to lip movements can help expose these issues. 
  • Inconsistent or absent blinking: Right now, AI has trouble simulating the blink of an eye, so deep fake algorithms can produce inconsistent blinking patterns or omit blinking altogether. 
  • Irregular reflections or shadowing: Deep fake algorithms also do a shoddy job of recreating reflections and shadows, so look closely at both on surrounding surfaces, in the background and in the person’s eyes. 
  • Pupil dilation: Most of the time, AI doesn’t alter the diameter of pupils, producing eyes that appear off. Watch for pupils that don’t dilate naturally when the person focuses on objects or adjusts to changing light sources. 
  • Artificial audio noise: Deep fakes can add artificial noise to audio files to mask any edits, a practice known as “artifacting.” 
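For technically inclined security teams, the audio point above can be screened for programmatically. The sketch below is not from HC3 or TechTarget; it is a minimal, hypothetical example using spectral flatness, a standard audio statistic that is near 1 for broadband noise and near 0 for tonal content such as speech. Unusually high flatness in a supposed voice recording can be one (imperfect) signal that masking noise has been layered in.

```python
import numpy as np

def spectral_flatness(signal: np.ndarray) -> float:
    """Spectral flatness (0..1): geometric mean over arithmetic mean
    of the power spectrum. Noise-like audio scores high; tonal audio low."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    power = power[power > 0]  # drop zero bins to avoid log(0)
    geometric_mean = np.exp(np.mean(np.log(power)))
    arithmetic_mean = np.mean(power)
    return float(geometric_mean / arithmetic_mean)

# Synthetic demo: a pure tone (stand-in for voiced speech) versus
# broadband noise (stand-in for masking "artifacting").
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 16000, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)
noise = rng.standard_normal(16000)

print(spectral_flatness(tone))   # tonal: low flatness
print(spectral_flatness(noise))  # noisy: much higher flatness
```

In practice, a screening tool would compute this per short frame of a recording and flag files whose flatness distribution differs sharply from known-good voice samples; the threshold and frame size here are illustrative, not prescribed by the warning.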

Jeff Wardon, Jr., is the assistant editor for the facilities market. 



November 1, 2024


Topic Area: Information Technology , Security

