Digital Homunculus (5/22/24)
TST Radio - A podcast by Ryan Gable
The prospect of digital replicas is growing in popularity and extends from voice-mirroring scams to fully communicative dead relatives. The FTC warned in March that AI voice scams are getting more sophisticated, and it had previously noted that scammers can use the technology to “generate spear-phishing emails, fake websites, fake posts, fake profiles, and fake consumer reviews, or to help create malware, ransomware, and prompt injection attacks.” They can also use AI for “imposter scams, extortion, and financial fraud.” The next step up is applications that predict death dates, AI girlfriends, and DNA testing companies, all of which require immense personal data to function. Some companies now offer, for a price, to replicate your loved ones in a digital environment built from all of that personal data. Researchers from Cambridge warn that this could leave users both digitally haunted and psychologically damaged or tortured, because they will never fully process the grief of loss. They say the use of such deadbots could result in people feeling as if they are being “stalked by the dead,” especially because of the contracts that must be signed and the use of these replicas to advertise to users as their deceased relative. In essence, we are selling our loved ones’ souls so that they will never rest in peace. We are sending them to a digital purgatory, seeding and feeding the other side, and manifesting a techno-Tulpa, or digital homunculus.

FREE ARCHIVE & RSS: https://www.spreaker.com/show/the-secret-teachings
Twitter: https://twitter.com/TST___Radio
Facebook: https://www.facebook.com/thesecretteachings
WEBSITE (BOOKS, RESUBSCRIBE for early show access): http://thesecretteachings.info
Paypal: [email protected]
CashApp: $rdgable
Buy Me a Coffee: https://www.buymeacoffee.com/tstradio
SUBSCRIBE TO NETWORK: http://aftermath.media
EMAIL: [email protected] / [email protected]
Become a supporter of this podcast: https://www.spreaker.com/podcast/tst-radio--5328407/support