A 2025-2026 trend targeting tourists and expats: scammers use AI tools to clone the voice of a family member, employer, or authority figure from publicly available audio (social media videos, voicemail greetings). They then call the victim posing as that person, claiming an emergency that requires an immediate wire transfer. Some scams use real-time deepfake video on LINE or WhatsApp video calls; others impersonate embassy staff or police officers. The AI-generated speech is often convincing enough to fool victims who are caught off guard, especially over a phone connection. Thailand's police issued warnings about this trend in early 2026.
Tourist Police Hotline
Call 1155 for English-language assistance, available 24/7