Copilot purposely stops working on code that contains words hardcoded into GitHub's banned list, such as "gender" or "sex". And if you prefix transactional data as trans_, Copilot will refuse to help you. 😑
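To make the claim concrete, here is a minimal sketch of the kind of perfectly ordinary code that would contain the flagged substrings. The names are purely illustrative assumptions on my part, not a confirmed repro of the filter:

```python
# Illustrative sketch only -- hypothetical field and variable names.
# The claim is that identifiers containing blocklisted substrings
# (e.g. "gender", "sex", or a "trans_" prefix meant for "transaction")
# cause Copilot to silently stop offering completions in the file.

from dataclasses import dataclass


@dataclass
class SurveyRespondent:
    respondent_id: int
    age: int
    gender: str  # ordinary demographic field; reportedly enough to trip the filter


def summarize_trans_batch(trans_records: list[dict]) -> float:
    """Sum the amounts in a batch of transactional records.

    'trans_' here is just shorthand for 'transaction', but the prefix
    allegedly matches the same blocklist.
    """
    return sum(record["amount"] for record in trans_records)
```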
Clearly the answer is to write code in emojis that are translated into hieroglyphs, then "processed" into Rust. And add a bunch of beloved AI keywords here and there. That way, when they get around to blocking that too, they'll inadvertently block their own favorite buzzwords.
It’s thinking like this that keeps my hope for technology hanging on by a thread